Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with at least 3 years of hands-on experience in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- The virtual interview requires your video to be on; are you okay with that?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a short integration sketch follows this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
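For a concrete sense of the integration work above, here is a minimal, hypothetical Python sketch that queries a Dremio dataset over Arrow Flight and loads the result into pandas. The endpoint, credentials, and table name are invented placeholders, and the exact connection details (TLS, port, auth) depend on the deployment.

```python
# Hypothetical sketch: querying Dremio over Arrow Flight from Python.
# Endpoint, credentials, and table path below are placeholders.
import pyarrow.flight as flight

client = flight.FlightClient("grpc://dremio.example.com:32010")  # 32010 is Dremio's default Flight port

# Dremio returns a bearer-token header pair after basic authentication.
token_pair = client.authenticate_basic_token("analyst", "analyst-password")
options = flight.FlightCallOptions(headers=[token_pair])

query = 'SELECT region, SUM(amount) AS revenue FROM lake.sales GROUP BY region'
info = client.get_flight_info(flight.FlightDescriptor.for_command(query), options)

# Read all result batches into an Arrow table, then hand off to pandas (pandas must be installed).
reader = client.do_get(info.endpoints[0].ticket, options)
df = reader.read_all().to_pandas()
print(df.head())
```

Arrow Flight is typically favored over JDBC/ODBC for large extracts because results stream back as columnar Arrow batches rather than row by row.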
Ideal Candidate
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning; a Parquet-on-S3 sketch follows this list.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
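To illustrate the storage-format side of these requirements, the following is a small sketch that scans a partitioned Parquet dataset on S3 with PyArrow. The bucket, prefix, partition column, and filter value are assumptions made up for illustration, and AWS credentials are assumed to come from the environment.

```python
# Hypothetical sketch: scanning a partitioned Parquet dataset on S3 with PyArrow.
# Bucket, prefix, columns, and partition values are placeholders.
import pyarrow.dataset as ds
import pyarrow.fs as pafs

s3 = pafs.S3FileSystem(region="us-east-1")  # credentials resolved from the environment

dataset = ds.dataset(
    "analytics-lake/curated/orders/",   # "bucket/prefix"; no s3:// scheme when a filesystem is passed
    filesystem=s3,
    format="parquet",
    partitioning="hive",                 # e.g. .../order_date=2024-01-01/part-0.parquet
)

# Push down a column projection and a partition filter before materializing.
table = dataset.to_table(
    columns=["order_id", "customer_id", "amount"],
    filter=ds.field("order_date") == "2024-01-01",
)
print(table.num_rows)
```

The same projection and partition-pruning ideas are what query engines such as Dremio, Trino, or Athena rely on to avoid scanning entire tables.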
Striim (pronounced “stream” with two i’s for integration and intelligence) was founded in 2012 with a simple goal of helping companies make data useful the instant it’s born.
Striim’s enterprise-grade streaming integration with intelligence platform makes it easy to build continuous, streaming data pipelines – including change data capture (CDC) – to power real-time cloud integration, log correlation, edge processing, and streaming analytics.
• 2-5 years of experience in programming (any language; polyglot preferred) and system operations.
• Awareness of DevOps and Agile methodologies.
• Proficient in leveraging CI and CD tools to automate testing and deployment.
• Experience working in an agile and fast-paced environment.
• Hands-on knowledge of at least one cloud platform (AWS / GCP / Azure).
• Cloud networking knowledge: should understand VPCs, NATs, and routers (see the sketch after this list).
• Contributions to open source are a plus.
• Good written communication skills are a must. Contributions to technical blogs / whitepapers will be an added advantage.
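As a brief illustration of the cloud-networking awareness mentioned above, here is a hedged Python sketch using boto3 to list the NAT gateways and routes associated with a VPC. The region and VPC ID are placeholders, and credentials are assumed to come from the environment.

```python
# Hypothetical sketch: inspecting NAT gateways and route tables in a VPC with boto3.
# Region and VPC ID are placeholders; credentials come from the environment.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
vpc_filter = [{"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]}]

for nat in ec2.describe_nat_gateways(Filters=vpc_filter)["NatGateways"]:
    print(nat["NatGatewayId"], nat["State"], nat["SubnetId"])

for rt in ec2.describe_route_tables(Filters=vpc_filter)["RouteTables"]:
    for route in rt["Routes"]:
        print(rt["RouteTableId"], route.get("DestinationCidrBlock"), route.get("NatGatewayId"))
```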
We are looking for a self-motivated and passionate individual with a strong desire to learn and the ability to lead. This position is for a Flight Test Engineer with exposure to building and flying sUAS (RC multirotors and fixed wings). See the detailed job description below.
Responsibilities
• Plan and execute flight test plans for new software features, electronics, sensors, and payloads.
• Perform hands-on mechanical and electrical integration of new hardware components on the internal fleet of test vehicles for R&D and testing.
• Troubleshoot and debug any component of a drone, in the office or in the field. Maintain the vehicles – keep the fleet ready for flight tests.
• Participate in defining and validating customer workflows and enhancing user experience.
• Coordinate cross-team efforts among FlytBase engineers to resolve issues identified during flight tests.
• Drive collaboration with FlytBase Developer team, Business Development team and Customer Support team to incorporate customer feedback and feature requests into FlytBase’s product development cycle.
• Learn about the domain and competitors to propose new drone applications, as well as improvements to existing applications.
Experience/Skills
• Experience in flight testing and operating/piloting small UAS and/or RC aircraft (both fixed-wing and multirotor systems).
• Experience in using flight-planning and ground control station software.
• Familiarity with UAV platforms such as Pixhawk, DJI, ArduPilot, and PX4.
• Experience in integrating, operating, and tuning autopilots on a variety of unmanned vehicles.
• Basic knowledge of electrical test equipment (multimeter, oscilloscope) and UAS sensors.
• Ability to work hands-on with electro-mechanical systems, including assembly, disassembly, testing, and troubleshooting.
• Good verbal and written communication skills.
Good to have
• RF communications fundamentals.
• Passionate about aerial robots, i.e., drones.
• Programming languages and scripting for engineering use (C++, C, MATLAB, Python); a small log-analysis sketch follows below.
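As a small example of the engineering scripting listed above, here is a hypothetical Python sketch that summarizes a CSV flight-test log. The file name and column names (time_s, voltage_v, altitude_m) are invented for illustration; real logs from an autopilot would need their own parsing.

```python
# Hypothetical sketch: summarizing a CSV flight-test log with Python.
# File name and column names are placeholders.
import csv
import statistics

times, voltages, altitudes = [], [], []
with open("flight_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        times.append(float(row["time_s"]))
        voltages.append(float(row["voltage_v"]))
        altitudes.append(float(row["altitude_m"]))

print(f"Flight duration: {times[-1] - times[0]:.1f} s")
print(f"Voltage: min {min(voltages):.2f} V, mean {statistics.mean(voltages):.2f} V")
print(f"Max altitude: {max(altitudes):.1f} m")
```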
Compensation:
As per industry standards.
Perks:
+ Fast-paced Startup culture
+ Hacker mode environment
+ Great team
+ Flexible work hours
+ Informal dress code
+ Free snacks


