Hi,


We are currently hiring for the role of Sr. Databricks Solution Architect.



Location: Plano, TX (must be local to the Dallas area); onsite at the client office 3 days/week.

Rate: $90



*Please note that only profiles submitted with a CV and the requested
details and documents, as listed at the end of this email, will be
considered for further discussion.*



Role Summary

We are looking for a highly skilled Databricks Solution Architect to lead
the design and implementation of scalable, enterprise-grade data platforms
using Databricks. The ideal candidate combines strong technical expertise
in data engineering and cloud platforms (AWS/Azure/GCP) with architectural
leadership, solution design capability, and stakeholder engagement skills.



Key Responsibilities



1. Solution Architecture & Design

Design end-to-end data architectures using Databricks Lakehouse Platform.

Architect scalable ETL/ELT pipelines, real-time streaming solutions, and
advanced analytics platforms.

Define data models, storage strategies, and integration patterns aligned
with business and enterprise architecture standards.

Provide guidance on cluster configuration, performance optimization, cost
management, and workspace governance.
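
For illustration only, here is a minimal sketch of the kind of Delta Lake
ETL step this role would design. The paths, column names, and table names
are hypothetical, and `spark` is the session the Databricks runtime
provides in a notebook.

```python
from pyspark.sql import functions as F

# Read raw landing-zone files (hypothetical path and format).
raw = spark.read.format("json").load("/mnt/raw/orders/")

# Light cleanup: de-duplicate on a hypothetical business key and stamp
# the ingestion time.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingest_ts", F.current_timestamp())
)

# Append to a Delta table (hypothetical target name).
(
    cleaned.write
        .format("delta")
        .mode("append")
        .saveAsTable("analytics.orders_bronze")
)
```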



2. Technical Leadership

Lead technical discussions and design workshops with engineering teams and
business stakeholders.

Provide best practices, frameworks, and reusable component designs for
consistent delivery.

Perform code reviews and provide technical mentoring to data engineers and
developers.



3. Stakeholder & Project Engagement

Collaborate with product owners, business leaders, and analytics teams to
translate business requirements into scalable technical solutions.

Create and present solution proposals, architectural diagrams, and
implementation strategies.

Support pre-sales or discovery phases with technical input when needed.



4. Data Governance, Security & Compliance

Define and implement governance standards across Databricks workspaces
(data lineage, cataloging, access control, etc.).

Ensure compliance with regulatory and organizational security frameworks.

Implement best practices for monitoring, auditing, and data quality
management.
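
As a small illustration of the access-control side of workspace governance,
a Unity Catalog table grant issued from a notebook might look like the
following; the catalog, schema, table, and group names are all hypothetical.

```python
# Grant read access on a Unity Catalog table to a workspace group.
# All identifiers below are placeholders, not client-specific names.
spark.sql(
    "GRANT SELECT ON TABLE main.analytics.orders_bronze TO `data-analysts`"
)
```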



5. Continuous Improvement & Innovation

Stay updated on Databricks features, roadmap, and industry trends.

Recommend improvements, optimizations, and modernization opportunities
across the data ecosystem.

Evaluate integration of complementary technologies (Delta Live Tables,
MLflow, Unity Catalog, streaming frameworks, etc.).



Required Skills & Experience

Technical Skills

Databricks Expertise: Strong hands-on experience with Databricks (clusters,
notebooks, Delta Lake, MLflow, Unity Catalog).

Cloud Platforms: Experience with at least one cloud provider (AWS, Azure,
GCP).

Data Engineering: Strong proficiency in Spark, Python, SQL, and distributed
data processing.

Architecture: Experience designing large-scale data solutions including
ingestion, transformation, storage, and analytics.

Streaming: Experience with streaming technologies (Structured Streaming,
Kafka, Kinesis, Azure Event Hubs).

DevOps: CI/CD practices for data pipelines (Azure DevOps, GitHub Actions,
Jenkins, etc.).
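
By way of illustration, a minimal Structured Streaming sketch of the
Kafka-to-Delta pattern referenced above. The broker address, topic name,
and checkpoint/table locations are hypothetical; `spark` is the session
provided by the Databricks runtime.

```python
from pyspark.sql import functions as F

# Read a Kafka topic as a streaming DataFrame (hypothetical broker/topic).
stream = (
    spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "orders")
        .load()
        .select(
            F.col("value").cast("string").alias("payload"),
            F.col("timestamp"),
        )
)

# Append to a Delta table, using a checkpoint to track streaming progress.
(
    stream.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/orders")
        .outputMode("append")
        .toTable("analytics.orders_stream")
)
```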

Soft Skills

Strong communication skills with the ability to engage both technical and
business teams.

Experience working in Agile environments.

Ability to simplify complex technical concepts for non-technical audiences.

Strong analytical, problem-solving, and decision-making abilities.



Preferred Qualifications

Databricks Certified Data Engineer Professional / Architect certification.

AWS/Azure/GCP cloud architect certifications.

Experience with BI tools (Tableau, Power BI, Looker).

Experience in machine learning workflows and ML operations.

Background in large-scale data modernization or cloud migration projects.



Why Join Us?

Opportunity to lead high-impact data initiatives using cutting-edge
Databricks capabilities.

Work with a highly skilled team of engineers, architects, and analytics
professionals.

Professional growth opportunities including certifications and advanced
architecture training.

Collaborative environment that values innovation and continuous improvement.

Deliverables

-Process Flows

-Mentor client project team members and facilitate knowledge transfer

-Participate as primary, co-, or contributing author on all project
deliverables associated with their assigned areas of responsibility

-Participate in data conversion and data maintenance

-Provide best-practice and industry-specific solutions

-Advise on and provide alternative (out-of-the-box) solutions

-Provide thought leadership as well as hands-on technical
configuration/development as needed.

-Participate as a member of the project team

-Perform other duties as assigned.

*Full Legal Name*



*Contact Information*



*Work Authorization Status*



*Copy of Work Authorization*



*Current Location*



*Willingness to Relocate*



*Availability to Start*



*Expected Rate*



*2 Professional References*



*LinkedIn Profile*



*Face-to-Face Interview: Available Dates & Times*



*Last 4 digits of SSN*



*Driving License Copy*



*Passport copy*




*Thanks,*
*Lyra Dass*

*Human Resources Manager*
*Digital Resource Partners LLC*
*+1(945)248-3020*
*https://drpscorp.com/*
