Ciber Global, LLC
Dearborn, MI
Data Engineer
Data Factory Engineers will be responsible for designing the transformation and modernization of big data solutions on GCP, integrating native GCP services and 3rd-party data technologies. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and 3rd-party technologies for deployment on GCP.
Key responsibilities include:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the Data Platform.
- Implement methods for automating all parts of the pipeline to minimize labor in development and production.
- Identify, develop, evaluate, and summarize Proofs of Concept to prove out solutions. Test and compare competing solutions and report a point of view on the best solution.
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (illustrated in the sketch after this list).
- Migrate and productionize existing Big Data pipelines on Google Cloud Platform.
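For context, here is a minimal sketch of the kind of streaming pipeline pattern described above, assuming Apache Beam's Java SDK with the GCP IO connectors on the classpath; the project, topic, and table names are hypothetical:

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuery {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    pipeline
        // Read raw messages from a (hypothetical) Pub/Sub topic.
        .apply("ReadFromPubSub",
            PubsubIO.readStrings().fromTopic("projects/my-project/topics/events"))
        // Wrap each message payload in a BigQuery TableRow.
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((String msg) -> new TableRow().set("payload", msg)))
        // Stream rows into a (hypothetical) existing BigQuery table.
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-project:analytics.events")
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}

The same pipeline runs locally for development or on the Dataflow runner in production, which is one reason Beam is a common choice for the Pub/Sub-to-BigQuery pattern the role describes.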
Experience Required:
- 5+ years of coding skills in Java
- In-depth understanding of Google's product technology and underlying architectures
- Google Cloud Platform (GCP) Certification preferred
- 5+ years of application development experience required, 3+ years of GCP experience
- Experience working in GCP-based Big Data deployments (Batch/Real-Time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, etc.