Job Title: Python/GCP Data Engineer
Location: Remote
Client: CVS Health
Compensation: $48–$55/hour (based on years of experience)
Overview:
We are looking for a highly skilled and experienced Python/GCP Data Engineer to join our team for a project with CVS Health. The ideal candidate should have a strong background in Python and Google Cloud Platform (GCP), with a proven track record of working on complex data engineering projects.
Responsibilities:
• Design, develop, and maintain scalable data pipelines on GCP.
• Collaborate with cross-functional teams to ensure data solutions align with business goals.
• Implement data storage solutions, process large-scale datasets, and optimize query performance.
• Develop and maintain documentation of processes and data flows.
• Ensure data security, integrity, and compliance with industry standards.
Requirements:
• Proven experience in Python development and GCP.
• Strong understanding of GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Expertise in designing and managing data pipelines.
• Familiarity with CI/CD pipelines and cloud-based data engineering best practices.
• Strong communication skills and the ability to work effectively in a remote team.
Preferred Qualifications:
• Master’s degree from a US-based institution in a relevant field (nice to have, not required).
• A LinkedIn profile that showcases past work and relevant project experience.
Application Process:
Please ensure that your LinkedIn profile is up to date (do not use a logo as your profile picture) and includes relevant project experience. Only candidates with genuine GCP experience and strong data engineering backgrounds will be considered.