AWS Data Architect/Databricks (Onsite) Job at Cognizant, Hartford, CT

  • Cognizant
  • Hartford, CT

Job Description

AWS Data Architect/Databricks (Onsite) at Cognizant summary:

The AWS Data Architect/Databricks role requires an experienced professional with 10 to 13 years of experience designing and implementing data architecture solutions using Spark, Scala, and Databricks technologies. The candidate will oversee data workflows, implement Structured Streaming solutions, and provide guidance on risk management and data security, all while supporting the Property & Casualty Insurance domain. The position demands strong technical skills, collaboration with cross-functional teams, and a commitment to continuous improvement in data architecture processes.

We are Cognizant Artificial Intelligence

Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. But to realize it, clients need new business models built on analyzing customers and business operations from every angle to truly understand them.

With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models to the enterprise within weeks.

* You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future *

This is an onsite position open to any qualified applicant in the United States

Job Title: AWS Data Architect - Databricks

Job summary:

We are seeking an experienced Architect with 10 to 13 years of experience to join our team. The ideal candidate will have extensive technical skills in Spark (Scala), Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Additionally, experience in the Property & Casualty Insurance domain is mandatory.

Roles/Responsibilities

  • Own the design and implementation of data architecture solutions using Spark in Scala and Databricks technologies.
  • Supervise the development and deployment of Delta Sharing and Databricks Unity Catalog administration.
  • Provide guidance on the Databricks CLI and Delta Live Pipelines to streamline data workflows.
  • Implement and handle Structured Streaming solutions to ensure real-time data processing.
  • Apply risk management principles to ensure data security and compliance.
  • Use Apache Airflow for orchestrating sophisticated data workflows.
  • Lead data storage and retrieval using Amazon S3 and Amazon Redshift.
  • Develop and maintain Python scripts for data processing and automation.
  • Build and optimize Databricks SQL queries for efficient data analysis.
  • Implement Databricks Delta Lake for scalable and reliable data lakes.
  • Craft and manage Databricks Workflows to automate data pipelines.
  • Apply PySpark for large-scale data processing and analytics.
  • Collaborate with multi-functional teams to ensure data solutions meet business requirements.
  • Provide technical guidance and mentorship to junior team members.
  • Ensure all solutions adhere to industry best practices and company standards.
  • Contribute to the continuous improvement of data architecture processes and methodologies.
  • Stay updated with the latest industry trends and technologies to drive innovation.
  • Ensure the architecture solutions align with the company goals and objectives.
  • Support the Property & Casualty Insurance domain with tailored data solutions.
  • Deliver high-quality scalable and maintainable data architecture solutions.
  • Make effective use of the hybrid work model to sustain productivity.

Qualifications

  • Extensive experience with Spark in Scala and Databricks technologies.
  • Proficiency in Delta Sharing and Databricks Unity Catalog Admin.
  • Expertise in Databricks CLI and Delta Live Pipelines.
  • Strong knowledge of Structured Streaming and risk management.
  • Experience with Apache Airflow Amazon S3 and Amazon Redshift.
  • Proficiency in Python and Databricks SQL.
  • Experience with Databricks Delta Lake and Databricks Workflows.
  • Solid skills in PySpark for data processing and analytics.
  • Mandatory experience in the Property & Casualty Insurance domain.
  • Ability to work effectively in a hybrid work model.
  • Strong problem-solving and analytical skills.
  • Superb communication and teamwork abilities.
  • Dedication to continuous learning and professional development.

Certifications Required

  • Databricks Certified Data Engineer Associate
  • AWS Certified Solutions Architect
  • Apache Airflow Certification

Salary and Other Compensation:

Applications will be accepted until January 16, 2025.

The annual salary for this position is between $81,000 – $140,000 depending on experience and other qualifications of the successful candidate.

This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.

Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan

Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Keywords:

AWS Data Architect, Databricks, Spark, Data Architecture, Delta Lake, Data Solutions, Property & Casualty Insurance, Python, Apache Airflow, Structured Streaming
