Charles Schwab Corporation
Westlake, TX (Remote)


Your Opportunity

We believe that, when done right, investing liberates people to create their own destiny. We are driven by our purpose to champion every client’s goals with passion and integrity. We respect and appreciate the diversity of our employees, our clients, and the communities we serve. We challenge conventions strategically to create value for our clients, our firm and the world. We live and bring to life the concept of ‘own your tomorrow’ every day. We champion our employee strengths, guide their development, and invest in their long-term success. We hire optimistic, results-oriented, curious, innovative, and adaptable people with the desire to help our clients and one another succeed.

As a company, we were established by Chuck over 40 years ago to champion Main Street over Wall Street, and to help Americans transform themselves from earners to owners. Through advocacy and innovation, we work to make investing more affordable, accessible and understandable for all. As we enter our fifth decade, we are looking for talented, innovative and driven people who believe they can help themselves, and our clients, create a better future.

The WAM Data Architecture & Platform Engineering team supports the build-out of core data platform (WAM-Ex) capabilities: a cloud-native data platform (BigQuery/Snowflake) and core data capabilities — orchestration, data security, and data quality — to be shared across Wealth and Asset Management. The team defines and builds best practices and standards for federated development on the WAM-Ex data platform, designs consistent and connected logical and physical data models across data domains, and designs a consistent data engineering lifecycle for building data assets across initiatives on WAM-Ex.

What you are good at

* Work collaboratively with other engineers, architects, data scientists, analytics teams, and business product owners in an agile environment
* Architect, build, and support the operation of cloud and on-premises enterprise data infrastructure and tools
* Design and build robust, reusable, and scalable data-driven solutions and data pipeline frameworks to automate the ingestion, processing, and delivery of both structured and unstructured batch and real-time streaming data
* Build data APIs and data delivery services to support critical operational processes, analytical models, and machine learning applications
* Assist in the selection and integration of data-related tools, frameworks, and applications required to expand platform capabilities
* Understand and implement best practices in the management of enterprise data, including master data, reference data, metadata, data quality, and lineage

What you have

* Bachelor's degree in computer science, information systems, math, engineering, or another technical field, or equivalent experience
* Six years of experience with Python or Java
* Four or more years of experience building data lakes and cloud data platforms leveraging cloud-native (GCP/AWS) architecture, ETL/ELT, and data integration
* Three years of development experience with cloud services (AWS, GCP, Azure) and supporting tools (e.g., GCS, Dataproc, Cloud Dataflow, Airflow (Composer), Kafka, Cloud Pub/Sub)
* Expertise in developing distributed data processing and streaming frameworks and architectures (Apache Spark, Apache Beam, Apache Flink)
* In-depth knowledge of NoSQL database technologies (e.g., MongoDB, Bigtable, DynamoDB)
* Expertise in build and deployment tools (Visual Studio, PyCharm, Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus)
* Five years of experience and expertise in database design techniques and philosophies (e.g., RDBMS, document stores, star schema, Kimball modeling)
* Five years of experience with integration and service frameworks (e.g., API gateways, Apache Camel, Swagger/OpenAPI, ZooKeeper, Kafka, messaging tools, microservices)
* Expertise with containerized microservices and REST/GraphQL-based API development
* Experience leveraging continuous integration/delivery tools (e.g., Jenkins, Docker, containers, OpenShift, Kubernetes, and container automation) in a CI/CD pipeline
* Advanced understanding of software development and research tools
* Attention to detail and results orientation, with a strong customer focus
* Ability to work as part of a team and independently
* Strong analytical, problem-solving, and technical communication skills
* Ability to prioritize workload to meet tight deadlines

Recommended Skills

  • API
  • Adaptability
  • Agile Methodology
  • Airflow
  • Amazon DynamoDB
  • Amazon Web Services