

Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend toward cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premises or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver, and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems.
- Solve technical problems and customer issues with technical expertise.
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure.
- Mentor other team members on high-quality implementation and design.
- Collaborate with Product Management to deliver on customer requirements and innovation.
- Collaborate with Support and field teams to ensure that customers are successful with Dremio.
Requirements
- B.S./M.S. in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed storage systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team

- Strong proficiency in the Java programming language.
- Experience with Java frameworks like Spring and Spring Boot.
- Understanding of RESTful APIs and web services.
- Experience with databases and data storage technologies (e.g., SQL, NoSQL).
- Knowledge of software development best practices, including testing and code quality.
- Experience with version control systems (e.g., Git).
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration skills.

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
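The transform-and-validate stage described above can be sketched in plain Python. This is a minimal, library-free illustration of the pattern, not Dremio's or any employer's actual pipeline; the record fields (`id`, `email`) and cleansing rules are assumptions chosen for the example, and a real pipeline would run inside Dataflow or Cloud Composer rather than a plain script.

```python
# Minimal sketch of the transform/validate stage of an ETL pipeline.
# Field names and quality rules here are illustrative assumptions.

def transform(record: dict) -> dict:
    """Cleanse one raw record: trim string fields, normalize the email."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}
    if "email" in cleaned:
        cleaned["email"] = cleaned["email"].lower()
    return cleaned

def validate(record: dict) -> bool:
    """Basic data-quality check: required fields present and non-empty."""
    required = ("id", "email")
    return all(record.get(field) for field in required)

def run_pipeline(raw_records):
    """Extraction is assumed done upstream; transform, validate, route."""
    good, bad = [], []
    for rec in raw_records:
        rec = transform(rec)
        (good if validate(rec) else bad).append(rec)
    return good, bad

raw = [{"id": 1, "email": " Alice@Example.com "},
       {"id": 2, "email": ""}]
good, bad = run_pipeline(raw)
```

Routing failed records to a separate "bad" collection rather than dropping them is a common choice because it keeps quality failures observable for monitoring, one of the responsibilities listed above.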
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Key Qualifications
- Developer role, preferably Java
- CI/CD and DevOps (exposure)
- Messaging middleware (exposure to Kafka or any other messaging middleware)
- Database: Oracle preferred; any other database platforms (SQL/NoSQL)
- Server side: Java, Spring Boot microservices
- Exposure to any major cloud platform (AWS/Azure/GCP)
About the Company:
Alyke, recognized as India's first friendship app, is revolutionizing the way people find friends online through its innovative matching algorithm. Alyke has quickly gained traction, surpassing 1 million users. The platform uniquely connects individuals based on shared interests and proximity, and stands out for its commitment to creating a secure environment that encourages users to express themselves freely. Joining Alyke offers the chance to be part of an innovative team dedicated to reshaping the landscape of social connections.
Role Overview:
As a Senior Backend Developer, you will play a crucial role in the design, development, and optimization of our backend systems. You will be instrumental in building scalable, high-performance applications that support our business's needs. This role demands a deep understanding of backend technologies, database design, cloud infrastructure, and the ability to integrate a wide range of services and APIs.
Key Responsibilities
- Design and implement scalable, secure, and robust backend services and APIs.
- Lead the development of microservices architecture, ensuring optimal performance and scalability.
- Oversee the integration of third-party services and external APIs, including notification services like FCM.
- Develop and manage database schemas, optimizing for performance and scalability using MongoDB and Redis.
- Implement real-time data processing mechanisms for live updates and push notifications, utilizing technologies like Firebase.
- Manage cloud infrastructure and serverless functions on platforms such as AWS and GCP, ensuring efficient operations and cost management.
- Ensure the security of backend systems through robust authentication, authorization, data encryption, and adherence to best practices in data security and compliance.
- Work closely with the frontend team to integrate backend services and ensure a seamless user experience.
- Continuously evaluate and adopt new technologies and frameworks to improve the backend infrastructure.
- Mentor junior developers, fostering a culture of learning and growth within the team.
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in backend development, with a strong portfolio of projects demonstrating expertise in NodeJS, ExpressJS, and asynchronous programming.
- Extensive experience with database design, optimization, and management using MongoDB and Redis.
- Proven track record in developing and managing REST APIs.
- Deep understanding of microservices architecture and real-time data processing.
- Solid experience with caching mechanisms and Amazon SQS management.
- Comprehensive knowledge of security principles, including authentication, authorization, and data encryption.
- Familiarity with compliance and data protection standards.
- Proven experience building scalable microservices architectures.
- Experience with AWS Lambda functions.
- Experience working with Firebase.
- Proven experience in mobile application development.
- Working experience with Amazon SQS.
- Knowledge of Amazon ElastiCache.
- Experience with cloud-native development and managing cloud infrastructure on AWS or GCP.
- Excellent problem-solving skills, with the ability to lead projects and collaborate effectively with cross-functional teams.
- Strong communication and leadership skills, with a passion for mentoring and driving technical excellence.
- Please attach links to the mobile applications you have worked on to date.
Why Join Us?
- Opportunity to lead a cutting-edge project impacting users globally.
- Work in an environment that fosters innovation, collaboration, and professional growth.
- Competitive compensation and comprehensive benefits.

Location: Faridabad (WORK FROM OFFICE)
Qualification & Eligibility:
Graduate
CGPA no bar
Working Experience:
1-5 years
Roles & Responsibilities:
- Write clean, fast PHP code to a high standard, in a timely and scalable way.
- Create and implement an array of web-based products using PHP, MySQL, Ajax, and JavaScript.
- Develop back-end components, connect the application with other web services, and assist front-end developers by ensuring their work integrates with the application.
- Strong core PHP hands-on experience.
- Strong expertise in the CodeIgniter framework.
- Good knowledge of PHP 8, MySQL/PostgreSQL, Bootstrap, jQuery, JavaScript, HTML5, CSS3, and JSON.
- Good knowledge of database design.
- Salary no bar for deserving candidates.

Striim (pronounced “stream” with two i’s for integration and intelligence) was founded in 2012 with a simple goal of helping companies make data useful the instant it’s born.
Striim’s enterprise-grade, streaming integration with intelligence platform makes it easy to build continuous, streaming data pipelines – including change data capture (CDC) – to power real-time cloud integration, log correlation, edge processing, and streaming analytics.
- Strong Core Java / C++ experience
- Excellent understanding of logical and object-oriented design patterns, algorithms, and data structures
- Sound knowledge of application access methods, including authentication mechanisms, API quota limits, and different endpoint types (REST, Java, etc.)
- Strong experience with databases: not just SQL programming but knowledge of DB internals
- Sound knowledge of cloud databases available as a service (RDS, CloudSQL, Google BigQuery, Snowflake) is a plus
- Experience working in any cloud environment and microservices-based architecture utilizing GCP, Kubernetes, Docker, CircleCI, Azure, or similar technologies
- Experience in application verticals such as ERP, CRM, or Sales, with applications such as Salesforce, Workday, or SAP (not mandatory; added advantage)
- Experience in building distributed systems (not mandatory; added advantage)
- Expertise in data warehouses (not mandatory; added advantage)
- Experience in developing and delivering products as SaaS (not mandatory; added advantage)
Digit88 is looking for an enthusiastic, self-motivated, hands-on Java/J2EE platform engineer to join the back-end platform engineering team for our partner. Experience with a fast-paced India/US product start-up or a product engineering services company in a developer role, building a high-performance real-time system, is mandatory. Applicants having experience in developing and maintaining large-scale messaging platforms are preferred. Applicants must have a passion for engineering with accuracy and efficiency, be highly motivated and organized, able to work as part of a team, and also possess the ability to work independently with minimal supervision.
To be successful in this role, you should possess
- Bachelor's degree in Computer Science or a related field with 7-9 years of hands-on experience with a Java-based open source tech stack
- Expertise in Core Java, data structures, and J2EE, with proven expertise in Spring MVC, Spring Boot, microservices architecture, and RESTful web services in distributed systems
- Practical experience with MySQL and/or NoSQL databases like Couchbase, DynamoDB, Cassandra
- Practical experience with caching frameworks (Memcached/Redis) and message queues (JMS, RabbitMQ)
- Practical hands-on experience in JavaScript and NodeJS
- Experience in building high-performance, high-availability REST APIs and REST clients
- Expertise in log file analysis using one or more of ELK, Splunk, Kibana
- Prior experience with CI/CD, container architecture (Docker/Jenkins), and build scripts (Maven/Ant)
- Experience with Kubernetes
- Prior experience with transformation to cloud platforms is preferred
- Experience with Kafka is a definite plus
- Experience with building analytics pipelines and analytics DBs is a plus
- Strong practical experience applying design patterns and multithreading concepts to solve complex problems; strong problem-solving skills
You are someone who would easily be able to
- Study and learn the latest in the AI/NLP/chatbots domain and the messaging platform
- Work closely with the US and India engineering teams to help build the Java/Spring-based backend and REST APIs
- Closely collaborate with the principal engineer on the India engineering team on technical excellence and ownership of critical modules; own the development of new modules and features
- Troubleshoot live production server issues; assume leadership responsibilities in the production issue resolution lifecycle
- Handle client coordination, work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision
- Perform unit testing and integration testing in a continuous deployment scenario
- Follow Agile methodology and use JIRA for work planning and issue management/tracking
- Minimum 8 years of overall experience in software development.
- Experience as a lead developer.
- Experience with AWS, architecture, and Node.js.


