Job Details
ITL USA
Richardson, Texas, United States
(on-site)
Industry Categories: Other
Job Function: Other
GCP Data Engineer
Description
Infosys is seeking a Google Cloud Platform (GCP) data engineer with experience in GitHub and Python. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
Required Qualifications:
- Candidate must be located within commuting distance of Richardson, TX, or be willing to relocate to the area. This position may require travel within the US.
- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
- At least 4 years of Information Technology experience.
- Experience working with GCP data engineering technologies such as Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, and BigQuery.
- ETL development experience with a strong SQL background, using technologies such as Python/R, Scala, Java, Hive, Spark, and Kafka.
- Strong knowledge of Python program development to build reusable frameworks and enhance existing frameworks.
- Application build experience with core GCP services such as Dataproc, GKE, and Composer.
- Deep understanding of GCP IAM and GitHub.
- Must have performed IAM setup.
- Knowledge of CI/CD pipelines using Terraform in Git.
- Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers (see the BigQuery sketch after this list).
- Experience in relational modeling, dimensional modeling, and modeling of unstructured data.
- Knowledge of Airflow DAG creation, execution, and monitoring (see the DAG sketch after this list).
- Good understanding of Agile software development frameworks.
- Ability to work in teams in a diverse, multi-stakeholder environment comprising business and technology teams.
- Experience and desire to work in a global delivery environment.
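For illustration only, a minimal sketch of the kind of ingestion-to-transformation BigQuery work named above, using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical and not part of this posting.

```python
# Minimal sketch: build a transformation-layer table from an ingestion-layer
# table with the BigQuery Python client. All names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

sql = """
CREATE OR REPLACE TABLE `my-gcp-project.transform_layer.daily_orders` AS
SELECT order_date, SUM(amount) AS total_amount
FROM `my-gcp-project.ingest_layer.orders`
GROUP BY order_date
"""

job = client.query(sql)  # start the query job
job.result()             # block until the job finishes
```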
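Likewise, a minimal sketch of an Airflow DAG of the sort referenced in the qualifications, assuming Airflow 2.4+; the DAG id, schedule, and task are hypothetical placeholders rather than anything specified by the role.

```python
# Minimal sketch: a daily DAG with a single Python task. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_bigquery():
    # Placeholder for an ingestion step, e.g. loading raw files into BigQuery.
    print("Load raw files into the ingestion-layer dataset")


with DAG(
    dag_id="daily_ingestion",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_to_bigquery",
        python_callable=load_to_bigquery,
    )
```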
Job ID: 80922074