Location: REMOTE
Salary: $50.00 - $55.00 USD per hour
Description: Our client is currently seeking a Data Warehouse Developer.
Job Description:
• Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, and Cloud Composer.
• Develop and optimize SQL queries to support data extraction, transformation, and loading (ETL) processes.
• Collaborate with cross-functional teams, including business customers and Subject Matter Experts, to understand data requirements and deliver effective solutions.
• Implement best practices for data quality, data governance, and data security.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
• Contribute to data architecture decisions and provide recommendations for improving data pipelines.
• Stay up to date with emerging trends and technologies in cloud-based data engineering and cybersecurity.
• Exceptional communication skills, including the ability to gather relevant data and information, listen actively, engage in open dialogue, and verbalize ideas effectively.
• Ability to work in an Agile environment, delivering incremental value to customers by managing and prioritizing tasks.
• Proactively lead investigation and resolution efforts when data issues are identified, taking ownership and resolving them in a timely manner.
• Ability to interpret and document processes and procedures for producing metrics.
Must Have
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
• 3 - 5 years of hands-on data management experience gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a consumable form for visualization and data analysis.
• Strong expertise in Google BigQuery, Google Cloud Storage, Cloud Composer, and related Google Cloud Platform (GCP) services.
• Proficiency in Python and SQL for data processing and automation.
• Experience with ETL processes and data pipeline design.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
Nice to Have
• Experience with other GCP Services like Dataflow, Pub/Sub, or Data Studio.
• Knowledge of DevOps practices and tools such as Terraform.
• Familiarity with data visualization tools such as Tableau, Grafana, and/or Looker.
• Understanding of cybersecurity data tools and analysis methodologies.
• One or more of the following certifications preferred: CompTIA Security+, CISM, CISSP, CRISC, and/or CISA.
• Understanding of security control frameworks such as NIST, CIS Controls, COBIT, ISO, etc.
Contact: tbennett@judge.com
This job and many more are available through The Judge Group. Find us on the web at www.judge.com