
Cloud Engineer - Consulting Level

Nashville, TN
Classification: Contract-To-Hire
Contract Length: 12 Months
Job ID: 15675086

Location: 2555 Park Plaza, Building 4, Nashville, TN 37203
   
At CereCore, our heart for healthcare is interconnected with our knowledge of technical solutions, creating a vital link that ultimately drives the delivery of high-quality care. 

CereCore is seeking a Cloud Data Engineer – Consulting Level to join our team in Nashville, TN. This individual will work at HCA Healthcare, one of the nation's leading providers of healthcare services.

Overview:

This position sits within the Big Data Development team of our Information Management & Data Analytics organization. The Consulting Cloud Data Engineer will provide architectural solutions and development for healthcare projects, establish best practices for Google Cloud-based service development across Data & Analytics, and oversee other development within the team. The role involves solving business problems, integrating the platform with third-party services, and designing and developing complex features for business needs, specifically on Google Cloud Platform.

This position will be responsible for the development, transformation, and modernization of enterprise data solutions on Google Cloud Platform (GCP), integrating native GCP services and third-party data technologies. Solid experience with, and understanding of, the considerations for large-scale development and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must. The candidate must be a highly motivated self-starter committed to delivering high-quality solutions within agreed-upon timelines. The focus of this role will be to help design and develop the next generation of our data pipeline.

This person will be required to collaborate and communicate successfully with internal and external business leaders, domain SMEs, and technical staff in order to understand overarching business needs and then map those business and technological needs to reusable patterns that can be easily expanded and augmented as technology and business needs evolve.

Responsibilities:
 
  • Bring new data sources into GCP, transform them, and load them into BigQuery
  • Build and support a GCP ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data
  • Build automated CI/CD deployment processes for applications running in the cloud
  • Design, develop, and implement a migration plan for moving ELT and ETL pipelines to the more scalable and manageable services provided by Google Cloud
  • Participate proactively with corporate and division IT management to optimize the use of technology in support of business strategies
  • Communicate consistently with direct leadership and company leadership about issues, concerns, and potential roadblocks that may arise
  • Work with the engineering, DevOps, IAM, Security, and EDH teams to implement cloud-based solutions
  • Collaborate closely with team members to successfully execute development initiatives using Agile practices and principles
  • Apply a deep understanding of storage, compute, curated data, networking, and security on GCP
  • Develop and document cloud infrastructure designs for the integration and implementation of new cloud applications and systems
  • Apply experience with microservice architectures and service fabric development and deployments
  • Apply experience with cloud-native and serverless development, deployment within cloud environments, and container-based solutions
     
Position Requirements:
 
  • Bachelor's degree in Information Technology, Computer Science, or a related field, with at least 7 years of IT work experience
  • Strong understanding of best practices and standards for GCP data pipeline design and implementation
  • 2 years of hands-on experience with the GCP platform, including experience with many of the following components: GCS, GKE, Cloud Run, Cloud Functions, Bigtable, Firestore, Cloud SQL, Kafka, Pub/Sub, Python, Spark, Scala or Java, BigQuery, Cloud Composer, Dataflow, Dataproc, Spanner, Data Fusion, Cloud Build, Cloud Scheduler, OpenShift, Docker
  • Ability to manage and prioritize time with extreme efficiency
  • Experience with relational databases such as SQL Server and Teradata, and with document-based NoSQL databases such as MongoDB, Cassandra, or Cosmos DB
  • Excellent written and oral communication and presentation skills; persuasive, encouraging, motivating, and inspiring; the ability to listen and understand
  • Flexibility and creativity to succeed in meeting exceedingly tight project timelines
  • Experience with GCP cloud migration and with private, hybrid, or public cloud technologies
  • Skill in exercising initiative, judgment, problem solving, and decision-making
  • Knowledge of healthcare preferred

CereCore was formed in 2001 as a shared service business within a large hospital operator. We focus solely on helping healthcare organizations align business and IT strategies to improve processes and patient care.
 
Our Commitment to Diversity and Inclusion
We believe excellence in healthcare starts with a foundation of inclusion, compassion and respect for our patients and each other.  We are committed to fostering a culture of inclusion across all areas of our organization.  We are an equal opportunity employer and we value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
 
Connect with us on LinkedIn, Twitter, and Facebook.
