Software Engineer – Data
Location: Glendale, CA – Hybrid Onsite Schedule
Employment Type: Contract
Job ID: 140739
Date Added: 09/09/2024
The Company
Headquartered in Los Angeles, this leader in the Entertainment & Media space is focused on delivering world-class stories and experiences to its global audience. To offer the best entertainment experiences, its technology teams focus on continued innovation and the use of cutting-edge technology.
Platform / Stack
You will work with technologies that include Python, AWS, Airflow and Snowflake.
Compensation Expectation: $180,000-$200,000
What You’ll Do as a Sr. Data Engineer:
- Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
- Build tools and services to support data discovery, lineage, governance, and privacy
- Collaborate with other software/data engineers and cross-functional teams
- Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
- Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
- Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
- Engage with and understand our customers, forming relationships that allow us to prioritize both innovative new offerings and incremental platform improvements
- Maintain detailed documentation of your work and changes to support data quality and data governance requirements
Qualifications
You could be a great fit if you have:
- 5+ years of data engineering experience developing large data pipelines
- Proficiency in at least one major programming language (e.g. Python, Java, Scala)
- Strong SQL skills and ability to create queries to analyze complex datasets
- Hands-on production environment experience with distributed processing systems such as Spark
- Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
- Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
- Experience in developing APIs with GraphQL
- Deep understanding of AWS or other cloud providers, as well as infrastructure as code
- Familiarity with data modeling techniques and data warehousing standards and best practices
This client requires a completed background check. Background checks protect the company/client and its stakeholders by ensuring we hire individuals with a trustworthy history, which helps maintain a safe and secure workplace, minimizes potential risks, and promotes a culture of integrity within the organization.
Benefits Offered:
Employer provides access to:
- 3 levels of medical insurance for you and your family
- Dental insurance for you and your family
- 401k
- Overtime
- California sick leave policy: accrue 1 hour for every 30 hours worked, up to 48 hours. If you are based in a different state, please inquire about that state’s sick leave policy.