Where Data Engineering Meets Excellence
Come work with top data experts to tackle real-life data problems!
Our Approach To Data
AI, ML, DS & DA projects are cool, but our mission is to take projects out of the lab and into real-world impact. That is not possible without hardcore engineering, which is why SunnyData is built by engineers, for engineers.
We start at Data Engineering, ensuring a robust groundwork before advancing to AI, machine learning, data science and analytics. This journey, from foundational engineering to transformative applications, is central to our mission.
“Contrary to common belief, successful analytics and AI require a strong Data Engineering base. At SunnyData, we commit to transforming AI aspirations into reality, backed by solid engineering expertise.” Kai Thapa, CEO.
Our Core Non-Negotiable Values
-
1. Unity
Unity is our pillar. We thrive on reciprocal support and provide a nurturing space where taking chances and expressing uncertainty is the norm. We encourage asking questions, learning from setbacks, and confidently voicing ideas. Our power comes from our community, combining different abilities, insights, and concepts to develop outstanding solutions.
-
2. Kaizen
Our drive is rooted in a profound desire to deeply understand – our technology, our team, and our clients. We delve into the intricacies of technology, nurture empathy amongst our team, and dedicate ourselves to comprehending our clients' needs, ensuring the delivery of both technical excellence and meaningful business outcomes.
-
3. Resilience
We mix passion with hard work to creatively solve tough problems. Our strong determination helps us overcome challenges and deliver great results for our clients, always aiming to do more than expected in our pursuit of top-notch quality.
-
4. Innovation
Innovation is the heartbeat of SunnyData. We constantly explore new ideas, push boundaries, and embrace cutting-edge technologies. This value is about more than just being creative; it's about transforming novel ideas into practical solutions. Innovation keeps us at the forefront of data engineering, ensuring we always offer the latest and most effective solutions.
The Talent You Will Work With
Our Technology Stack
-
We help customers build highly scalable architectures, robust data engineering pipelines, and easy-to-use data consumption layers, and, most importantly, ML and AI applications that power their business and drive outstanding business outcomes.
-
We choose Databricks because it offers the widest, deepest, and most promising data platform to help customers exceed their business goals. Databricks is at the forefront of addressing the entire lifecycle of Data and AI needs, becoming even more vital with advancements like ChatGPT. By rapidly solving Data Engineering and AI/ML challenges, it is reshaping data analytics and AI into an integrated platform for future innovation. Databricks provides scalability, integration with any cloud, cutting-edge AI and DS capabilities, and a flexible pricing model. Organizations migrate to Databricks to streamline data operations, enhance analytics, and embed AI into their workflows, driving substantial business outcomes.
-
We've curated a powerful ecosystem of technology partners that integrate seamlessly with the Databricks Lakehouse Platform. These tools enhance reliability and scalability, and empower you to achieve ambitious targets and maximize ROI.
Alongside Databricks, our tech stack features Fivetran, Prophecy, MonteCarlo & Sigma.
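To give a flavor of the pipeline work this stack supports, here is a minimal PySpark sketch of a typical Databricks ingestion step. The storage path, table name, and column names are purely illustrative assumptions, not taken from any customer project.

# A minimal, illustrative PySpark sketch (hypothetical paths, schema, and table names)
# of the kind of pipeline step described above: ingest raw data, cleanse it, and
# publish a consumption-ready Delta table for analytics and ML.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

# Read raw JSON events from cloud object storage (path is an assumption for illustration)
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing: deduplicate, normalize the timestamp type, drop invalid records
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)

# Publish as a Delta table that downstream BI and ML workloads can consume
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")

Real engagements layer on governance, data quality checks, and orchestration, but the basic shape of the work stays the same.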
Job Board
Check out the latest available positions at SunnyData
Senior Solutions Architect - USA - Remote
-
About SunnyData: At SunnyData, a leading Databricks technology partner, our mission is to empower customers with scalable architectures, robust data engineering pipelines, seamless data consumption layers, and advanced ML and AI applications. As a Senior Solutions Architect, you will be instrumental in driving customer success, both in terms of technical execution and pre-sales support.
-
Customer Engagement: Serve as a trusted advisor to customers, guiding them through their data engineering and data architecture needs with a focus on Databricks solutions. In this role you will split your time evenly between billable and pre-sales activity.
Technical Leadership: Design, build, and deploy comprehensive data solutions that capture, transform, and leverage data to support AI, ML, and business intelligence initiatives.
Pre-Sales Support: Collaborate with sales teams to present technical solutions to prospective clients, demonstrating the value of SunnyData's offerings.
Project Oversight: Manage multiple customer accounts, ensuring timely delivery of solutions and tracking progress to report outcomes.
Solution Design: Architect data solutions, incorporating best practices in data governance, security, and quality.
Data Analysis: Evaluate data sources for their value, recommending data inclusion strategies to enhance analytical processes.
Cross-Functional Collaboration: Work closely with internal teams to deliver solutions and educate end users on data products and analytic environments.
Problem Resolution: Perform system analysis, assess and resolve data and system defects, and apply appropriate corrections.
Quality Assurance: Test data movement, transformation code, and data components to ensure accuracy and reliability.
-
Required Experience:
Experience: 5-7+ years as a hands-on Solutions Architect and/or Senior Data Engineer designing data solutions, including more recent experience implementing them, with a focus on Databricks.
Technical Proficiency: Expertise in Data Engineering technologies (e.g., Spark, Hadoop, Kafka), Databricks platform, and data science/machine learning technologies (e.g., pandas, scikit-learn).
Architecture and leadership skills: In-depth understanding of the end-to-end data analytics workflow (e.g., data modeling, ETL processes, and data integration) using modern data engineering techniques; ability to lead complex architecture requirements (discovery) and solution design sessions, and to build implementation architecture blueprints that data engineering and analytics teams can execute.
Certifications (at least 2 of the following): Associate Developer for Apache Spark; Data Engineer Associate; Professional Data Engineer; Machine Learning Associate; Professional ML Engineer
Programming Skills: Proficiency in Java, Python, and/or Scala.
Cloud Platforms: AWS, Azure, and/or GCP.
SQL Expertise: Ability to write, debug, and optimize SQL queries.
Client-Facing Skills: Strong written and verbal communication skills with experience in client-facing roles.
Presentation Skills: Ability to create and deliver detailed presentations to clients and stakeholders.
Documentation: Experience in creating detailed solution documentation including POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views.
Team Leadership: Experience leading teams and mentoring other engineers.
End-to-End Solutions: Ability to develop end-to-end technical solutions into production, ensuring performance, security, scalability, and robust data integration.
Preferred Experience:
Distributed Storage: Familiarity with cloud and distributed data storage systems such as S3, ADLS, HDFS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
Data Integration: Experience with data integration technologies like Spark, Kafka, Streamsets, Matillion, Fivetran, NiFi, AWS Data Migration Services, Azure DataFactory, and Informatica Intelligent Cloud Services (IICS).
Software Development Lifecycle: Comprehensive experience with the complete software development lifecycle including design, documentation, implementation, testing, and deployment.
Automated Pipelines: Expertise in automated data transformation and curation using tools like dbt, Spark, Spark streaming, and automated pipelines.
Workflow Management: Experience with workflow management and orchestration tools like Airflow, AWS Managed Airflow, Luigi, and NiFi.
Education: 4-year Bachelor's degree in Computer Science or a related field.
-
Benefits
Health Insurance: Employees and their eligible family members, including spouses, domestic partners, and children, are eligible for coverage from the first day of employment.
Paid Time Off: Start your career at SunnyData with a minimum of 20 days of Paid Time Off annually, plus nine paid company holidays.
An opportunity to grow your technical and people skills and to lead teams on complex, highly innovative, cutting-edge customer projects. A great opportunity to advance your career, with the right level of focus, innovation, and customer centricity, at a high-growth consulting company dedicated to Databricks.
Compensation Overview
The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.
Our Commitment to Diversity and Inclusion
At SunnyData, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at SunnyData are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
Senior Data Engineer - USA - Remote
-
At SunnyData, our mission is to help customers build highly scalable architectures, robust data engineering pipelines, and easy-to-use data consumption layers, and, most importantly, ML and AI applications that power their business and drive outstanding business outcomes. As a Senior Data Engineer, you will play a critical role in this customer journey. You will work directly with internal teams and customers to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning, and business intelligence / insights.
-
You will be part of a team responsible for supporting new and existing customers in their data engineering needs.
You will guide customers to make the best technical decisions to achieve their goals.
You will actively work across multiple customer accounts, tracking and reporting on their progress.
You will build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing / quality solutions.
You will design data solutions.
You will analyze sources to determine value and recommend data to include in analytical processes.
You will incorporate core data management competencies including data governance, data security and data quality.
You will collaborate within and across teams to support delivery and educate end users on data products / analytic environments.
You will perform data and system analysis, assessment and resolution for defects and incidents of moderate complexity and correct as appropriate.
You will test data movement, transformation code, and data components.
-
Bachelor’s Degree in a STEM-related field, or equivalent.
4 to 6 years of related experience in data engineering and data product development.
Experience in one or more of the following:
Data Engineering technologies (e.g., Spark, Hadoop, Kafka)
Databricks platform - data engineering and/or ML ops experience would be a huge bonus
Data Science and Machine Learning technologies (e.g., pandas, scikit-learn, HPO)
Data Warehousing (e.g., SQL, OLTP/OLAP/DSS)
Solid understanding of the end-to-end data analytics workflow
Demonstrated domain expertise, including the ability to understand technical concepts and in-depth knowledge of the systems you work on
Proven problem-solving skills, including debugging skills
Strong verbal and written communication skills
Leadership - Intermediate leadership skills with a proven track record of self-motivation in identifying personal growth opportunities
Excellent time management and prioritization skills
Knowledge of public cloud platforms (AWS, Azure, or GCP) would be a plus
Nice to have: Databricks Certification
-
Health Insurance: Employees and their eligible family members, including spouses, domestic partners, and children, are eligible for coverage from the first day of employment.
Paid Time Off: Start your career at SunnyData with a minimum of 20 days of Paid Time Off annually, plus nine paid company holidays.
An opportunity to grow your technical and people skills and to lead teams on complex, highly innovative, cutting-edge customer projects. A great opportunity to advance your career, with the right level of focus, innovation, and customer centricity, at a high-growth consulting company dedicated to Databricks.
-
The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.
Salary Range - TBD
Junior Data Engineer - Uruguay - Hybrid
-
Are you ready to jump-start your career in the exciting and innovative data engineering field?
At SunnyData, our mission is to help our customers in their data journey to power their business. We believe a solid foundation is the key to achieving outstanding data-driven results: highly scalable architectures, robust data engineering pipelines, simple data consumption layers, and accurate ML and AI applications. As a Junior Data Engineer, you will play a critical role in this customer journey. You will work directly with internal teams and customers to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning, and Business Intelligence insights.
-
Be part of a team responsible for supporting new and existing customers in their data engineering needs.
Guide customers to make the best technical decisions to achieve their goals.
Actively work across multiple customer accounts, tracking and reporting on their progress.
Design, build and operationalize complex data solutions, fix issues, apply transformations, and recommend data cleansing / quality solutions.
Analyze data sources to determine their value and recommend which data to include in analytical processes.
Incorporate core data management competencies including data governance, data security and data quality.
Collaborate within and across teams to support delivery and educate end users on data products / analytic environments.
Perform data and system analysis, assessment and resolution for defects and incidents of moderate complexity and correct as appropriate.
Test data movement, transformation code, and data components.
-
Bachelor’s Degree, or advanced-level student, in Computer Science or an Engineering-related field.
Highly motivated individual with a continuous learning mindset. Excellent time management and prioritization skills.
Strong English verbal and written communication skills.
Part-time (6h) or full-time (8h) availability.
Familiarity with public cloud platforms AWS, Azure or GCP.
Nice to have: Experience in one or more of the following
a. Data Engineering technologies (e.g., Spark, Hadoop, Kafka)
b. Data Science and Machine Learning technologies (e.g., pandas, scikit-learn, HPO)
c. Data Warehousing (e.g., SQL, OLTP/OLAP/DSS)
Nice to have: Databricks data engineering or machine learning certification.
-
1400 - 2000 USD.
Senior Data Engineer - Uruguay - Hybrid
-
At SunnyData, our mission is to help customers build highly scalable architectures, robust data engineering pipelines, and easy-to-use data consumption layers, and, most importantly, ML and AI applications that power their business and drive outstanding business outcomes. As a Senior Data Engineer, you will play a critical role in this customer journey. You will work directly with internal teams and customers to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning, and business intelligence / insights.
-
You will be part of a team responsible for supporting new and existing customers in their data engineering needs.
You will guide customers to make the best technical decisions to achieve their goals.
You will actively work across multiple customer accounts, tracking and reporting on their progress.
You will build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing / quality solutions.
You will design data solutions.
You will analyze sources to determine value and recommend data to include in analytical processes.
You will incorporate core data management competencies including data governance, data security and data quality.
You will collaborate within and across teams to support delivery and educate end users on data products / analytic environments.
You will perform data and system analysis, assessment and resolution for defects and incidents of moderate complexity and correct as appropriate.
You will test data movement, transformation code, and data components.
-
Bachelor’s Degree in Computer Science or Engineering related field.
3+ years of related experience in data engineering and data product development.
Experience in one or more of the following:
a. Data Engineering technologies (e.g., Spark, Hadoop, Kafka)
b. Data Science and Machine Learning technologies (e.g., pandas, scikit-learn, HPO)
c. Data Warehousing (e.g., SQL, OLTP/OLAP/DSS)
Solid understanding of the end-to-end data analytics workflow.
Demonstrated domain expertise, including the ability to understand technical concepts and in-depth knowledge of the systems you work on.
Proven problem-solving skills, including debugging skills.
Strong English verbal and written communication skills.
Leadership - Intermediate leadership skills with a proven track record of self-motivation in identifying personal growth opportunities.
Excellent time management and prioritization skills.
Nice to have: knowledge of public cloud platforms AWS, Azure or GCP.
Nice to have: Databricks data engineering or machine learning certification.
-
3000 - 4500 USD.