Solutions Architect - Data Engineering
Searce is hiring a lead FDS (‘forward deployed solver’) for its modern tech solutions advisory and ‘futurify’ consulting practice, architecting scalable data platforms and robust data engineering solutions that power intelligent insights and fuel AI innovation.
If you’re a tech-savvy, consultative seller with the brain of a strategist, the heart of a builder, and the charisma of a storyteller — we’ve got a seat for you at the front of the table.
You're not a sales lead. You're the transformation driver.
What are we looking for
a real solver?
Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.
- Improver. Solver. Futurist.
- Great sense of humor.
- ‘Possible. It is.’ Mindset.
- Compassionate collaborator. Bold experimenter. Tireless iterator.
- Natural creativity that doesn’t just challenge the norm, but solves to design what’s better.
- Thinks in systems. Solves at scale.
This Isn’t for Everyone. But if you’re the kind who questions why things are done a certain way, and then identifies three better ways to do it, we’d love to chat with you.
Your Responsibilities
what you will wake up to solve.
You are not just a Solutions Architect; you are a futurifier of our data universe and the primary enabler of our AI ambitions. With a deep-seated passion for data engineering, you will architect and build the foundational data infrastructure that powers the customer’s entire data intelligence ecosystem.
As the Directly Responsible Individual (DRI) for our enterprise-grade data platforms, you own the outcome, end-to-end. You are the definitive solver for our customer's most complex data challenges, leveraging a powerful tech stack including Snowflake and Databricks, alongside core GCP & AWS services (BigQuery, Spanner, Airflow, Kafka). This is a hands-on-keys role where you won't just design solutions—you'll build them, break them, and perfect them.
- Solution Design & Pre-sales Excellence: Collaborate with cross-functional teams, including sales, engineering, and operations, to ensure successful project delivery.
- Design Core Data Engineering: Master data modeling, architect high-performance data ingestion pipelines, and ensure data quality and governance throughout the data lifecycle.
- Enable Cloud & AI: Design and implement solutions utilizing core GCP data services, building foundational data platforms that efficiently support advanced analytics and AI/ML initiatives.
- Optimize Performance & Cost: Continuously optimize data architectures and implementations for performance, efficiency, and cost-effectiveness within the cloud environment.
- Bridge Business & Tech: Translate complex business requirements into clear technical designs, providing technical leadership and guidance to data engineering teams.
- Stay Ahead of the Curve: Continuously research and evaluate new data technologies, architectural patterns, and industry trends to keep our data platforms at the cutting edge.
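To make the data quality and governance responsibility above concrete, here is a minimal sketch of a row-level validation step of the kind a governance layer might enforce before data lands in a warehouse. The column names (`customer_id`, `amount`) and the rules are purely illustrative assumptions, not anything specified in this role.

```python
def validate_row(row: dict) -> list[str]:
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("customer_id is missing")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("amount must be non-negative")
    return errors


def partition_by_quality(rows):
    """Split a batch into clean rows and quarantined (row, errors) pairs."""
    clean, quarantined = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            quarantined.append((row, errors))
        else:
            clean.append(row)
    return clean, quarantined


batch = [
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "", "amount": -5.0},
]
clean, quarantined = partition_by_quality(batch)
```

Quarantining bad rows rather than dropping them silently is what makes the downstream data lineage and quality reporting described above possible.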
Functional Skills:
- Enterprise Data Architecture Design: Expert ability to design holistic, scalable, and resilient data architectures for complex enterprise environments.
- Cloud Data Platform Strategy: Proven capability to strategize, design, and implement cloud-native data platforms.
- Pre-Sales & Technical Storyteller: Crafts compelling, client-ready proposals, architectural decks, and technical demonstrations. Doesn't just present; shapes the strategic technical narrative behind every proposed solution.
- Advanced Data Modelling: Mastery in designing various data models for analytical, operational, and transactional use cases.
- Data Ingestion & Pipeline Orchestration: Strong expertise in designing and optimizing robust data ingestion and transformation pipelines.
- Stakeholder Communication: Exceptional skills in articulating complex technical concepts and architectural decisions to both technical and non-technical stakeholders.
- Performance & Cost Optimization: Adept at optimizing data solutions for performance, efficiency, and cost within a cloud environment.
Tech Superpowers:
- Cloud Data Mastery: You're a wizard at leveraging public cloud data services, with deep expertise in GCP (BigQuery, Spanner, etc.) and expert proficiency in modern data warehouse solutions like Snowflake.
- Data Engineering Core: Highly skilled in designing, implementing, and managing data workflows using tools like Apache Airflow and Apache Kafka. You're also an authority on advanced data modeling and ETL/ELT patterns.
- AI/ML Data Foundation: You instinctively design data pipelines and structures that efficiently feed and empower Machine Learning and Artificial Intelligence applications.
- Programming for Data: You have a strong command over key programming languages (Python, SQL) for scripting, automation, and building data processing applications.
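The workflow-orchestration skill above (Airflow-style DAGs of dependent tasks) boils down to running callables in dependency order. The sketch below illustrates that idea with Python's standard-library `graphlib`; it is a toy stand-in, not Airflow's actual API, and the task names are invented for illustration.

```python
from graphlib import TopologicalSorter


def run_pipeline(tasks: dict, deps: dict) -> list[str]:
    """Run task callables in an order that respects their dependencies."""
    # deps maps each task to the set of tasks that must run before it
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()
    return order


executed = []
tasks = {
    "extract":   lambda: executed.append("extract"),
    "transform": lambda: executed.append("transform"),
    "load":      lambda: executed.append("load"),
}
# transform depends on extract; load depends on transform
deps = {"transform": {"extract"}, "load": {"transform"}}
run_pipeline(tasks, deps)
```

Real orchestrators add scheduling, retries, and backfills on top of this core idea, but the dependency graph is the same mental model.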
Experience & Relevance:
- Architectural Leadership (8+ Years): You bring extensive experience (8+ years) specifically in a Solutions Architect role, focused on data engineering and platform building.
- Cloud Data Expertise: You have a proven track record of designing and implementing production-grade data solutions leveraging major public cloud platforms, with significant experience in Google Cloud Platform (GCP).
- Data Warehousing & Data Platform: Demonstrated hands-on experience in the end-to-end design, implementation, and optimization of modern data warehouses and comprehensive data platforms.
- Databricks & BigQuery Mastery: You possess significant practical experience with Databricks as a core lakehouse platform and GCP BigQuery for analytical workloads.
- Data Ingestion & Orchestration: Proven experience designing and implementing complex data ingestion pipelines and workflow orchestration using tools like Airflow and real-time streaming technologies like Kafka.
- AI/ML Data Enablement: Experience in building data foundations specifically geared towards supporting Machine Learning and Artificial Intelligence initiatives.
Join the ‘real solvers’
ready to futurify?
If you are excited by what an AI-native, engineering-led modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the Possible’.
Don’t Just Send a Resume. Send a Statement.
If you are passionate about tech and the future (we really are!), we’d love to hear from you.
Enterprise Data Architect - Dataeconomy (25+ Years Experience)
About Dataeconomy:
Dataeconomy is a rapidly growing company at the forefront of Information Technology. We are driven by data and committed to using it to make better decisions, improve our products, and deliver exceptional value to our customers.
Job Summary:
Dataeconomy seeks a seasoned and strategic Enterprise Data Architect to lead the company's data transformation journey. With 25+ years of experience in data architecture and leadership, you will be pivotal in shaping our data infrastructure, governance, and culture. You will leverage your extensive expertise to build a foundation for future growth and innovation, ensuring our data assets are aligned with business objectives and drive measurable value.
Responsibilities:
Strategic Vision and Leadership:
Lead the creation and execution of a long-term data strategy aligned with the company's overall vision and goals.
Champion a data-driven culture across the organization, fostering cross-functional collaboration and data literacy.
Advise senior leadership on strategic data initiatives and their impact on business performance.
Architecture and Modernization:
Evaluate and modernize the existing data architecture, recommending and implementing innovative solutions.
Design and implement a scalable data lake/warehouse architecture for future growth.
Advocate for and adopt cutting-edge data technologies and best practices.
ETL Tool Experience (8+ years):
Extensive experience in designing, developing, and implementing ETL (Extract, Transform, Load) processes using industry-standard tools such as Informatica PowerCenter, IBM DataStage, Microsoft SSIS, or open-source options like Apache Airflow.
Proven ability to build and maintain complex data pipelines that integrate data from diverse sources, transform it into usable formats, and load it into target systems.
Deep understanding of data quality and cleansing techniques to ensure the accuracy and consistency of data across the organization.
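The ETL experience described above (extract from diverse sources, cleanse and transform, load into a target system) can be sketched end-to-end in plain Python. In practice a tool like Informatica PowerCenter, SSIS, or an Airflow-managed pipeline handles these stages; the CSV source, table name, and cleansing rules here are illustrative assumptions only.

```python
import csv
import io
import sqlite3

RAW_CSV = "id,amount\n1, 10.50 \n2,3.25\n"  # stand-in for a source extract


def extract(text: str) -> list[dict]:
    """Parse the raw source into records."""
    return list(csv.DictReader(io.StringIO(text)))


def transform(rows: list[dict]) -> list[tuple]:
    """Cleanse: strip stray whitespace and cast to proper types."""
    return [(int(r["id"]), round(float(r["amount"].strip()), 2)) for r in rows]


def load(rows: list[tuple]) -> float:
    """Load into a target table; return the summed amount as a sanity check."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    total = con.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
    con.close()
    return total


total = load(transform(extract(RAW_CSV)))
```

The cleansing step in `transform` is where the data quality techniques mentioned above live: trimming, type casting, and (in a real pipeline) deduplication and referential checks.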
Data Governance and Quality:
Establish and enforce a comprehensive data governance framework ensuring data integrity, consistency, and security.
Develop and implement data quality standards and processes for continuous data improvement.
Oversee the implementation of master data management and data lineage initiatives.
Collaboration and Mentorship:
Mentor and guide data teams, including architects, engineers, and analysts, on data architecture principles and best practices.
Foster a collaborative environment where data insights are readily shared and acted upon across the organization.
Build strong relationships with business stakeholders to understand and translate their data needs into actionable solutions.
Qualifications:
Education: Master’s degree in Computer Science, Information Systems, or a related field; Ph.D. preferred.
Experience: 25+ years of experience in data architecture and design, with 10+ years in a leadership role.
Technical Skills:
Deep understanding of TOGAF, AWS, MDM, EDW, Hadoop ecosystem (MapReduce, Hive, HBase, Pig, Flume, Sqoop), cloud data platforms (Azure Synapse, Google BigQuery), modern data pipelines, streaming analytics, data governance frameworks.
Proficiency in programming languages (Java, Python, SQL), scripting languages (Bash, Python), data modelling tools (ER diagramming software), and BI tools.
Extensive expertise in ETL tools (Informatica PowerCenter, IBM DataStage, Microsoft SSIS, Apache Airflow).
Familiarity with emerging data technologies (AI/ML, blockchain), data security and compliance frameworks.
Soft Skills:
Outstanding communication, collaboration, and leadership skills.
Strategic thinking and problem-solving abilities with a focus on delivering impactful solutions.
Strong analytical and critical thinking skills.
Ability to influence and inspire teams to achieve goals.