

Bits In Glass
https://bitsinglass.com
About
Bits In Glass (BIG) is an award-winning software consulting firm that helps organizations improve operations and drive better customer experiences. They specialize in business process automation consulting, helping clients unlock the potential of their people, processes, and data.
Tech stack
Candid answers by the company
Bits In Glass (BIG) is an award-winning software consulting firm established in 2002 that specializes in business process automation. The company helps organizations improve their operations and customer experiences by implementing and managing automation solutions using industry-leading platforms like Appian, Pega, MuleSoft, and Blue Prism. Their primary focus is on helping clients across financial services, insurance, logistics, real estate, and healthcare sectors to modernize their operations by unlocking the potential of their people, processes, and data through technological solutions. As a global consulting firm with 500+ employees, they work with market leaders to drive digital transformation and provide innovative solutions for complex business challenges.
Jobs at Bits In Glass
Responsibilities
- Act as a liaison between business and technical teams to bridge gaps and support successful project delivery.
- Maintain high-quality metadata and data artifacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.
- Create and deliver high-quality data models while adhering to defined data governance practices and standards.
- Translate high-level functional or business data requirements into technical solutions, including database design and data mapping.
- Participate in requirement-gathering activities, elicitation, gap analysis, data analysis, effort estimation, and review processes.
Qualifications
- 8–12 years of strong data analysis and/or data modeling experience.
- Strong individual contributor with solid understanding of SDLC and Agile methodologies.
- Comprehensive expertise in conceptual, logical, and physical data modeling.
Skills
- Strong financial domain knowledge and data analysis capabilities.
- Excellent communication and stakeholder management skills.
- Ability to work effectively in a fast-paced and continuously evolving environment.
- Problem-solving mindset with a solution-oriented approach.
- Team player with a self-starter attitude and strong sense of ownership.
- Proficiency in SQL, MS Office tools, GCP BigQuery, Erwin, and Visual Paradigm (preferred).
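The modeling qualifications above span conceptual, logical, and physical design. As a rough illustration only (not part of the posting), a physical realization of a tiny star schema might look like the following, shown via Python's built-in sqlite3 as a stand-in for an enterprise database; every table and column name here is invented for the example.

```python
import sqlite3

# Minimal star schema: one fact table with foreign keys to two dimensions.
# All names are illustrative, not taken from the job description.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE fact_transaction (
    transaction_id INTEGER PRIMARY KEY,
    customer_key   INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key       INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount         REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31')")
conn.execute("INSERT INTO fact_transaction VALUES (100, 1, 20240131, 250.0)")

# A typical analytical query joins the fact table to its dimensions.
row = conn.execute("""
    SELECT c.customer_name, d.full_date, f.amount
    FROM fact_transaction f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date     d ON d.date_key     = f.date_key
""").fetchone()
print(row)  # ('Acme Corp', '2024-01-31', 250.0)
```

The same shape would be drawn first as a conceptual/logical model (in a tool such as Erwin, per the posting) before being generated as DDL.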
We are seeking a highly skilled Senior Data Engineer with expertise in Databricks, Python, Scala, Azure Synapse, and Azure Data Factory to join our data engineering team. The team is responsible for ingesting data from multiple sources, making it accessible to internal stakeholders, and enabling seamless data exchange across internal and external systems.
You will play a key role in enhancing and scaling our Enterprise Data Platform (EDP) hosted on Azure and built using modern technologies such as Databricks, Synapse, Azure Data Factory (ADF), ADLS Gen2, Azure DevOps, and CI/CD pipelines.
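A batch pipeline on a platform like the one described typically follows extract, transform, and load stages. The sketch below is a deliberately simplified, framework-free Python version of that shape; a real implementation would run on Databricks/ADF, and all record fields, rates, and names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Record:
    order_id: int
    amount: float
    currency: str

def extract(raw_rows):
    """Parse raw source rows (dicts standing in for a source system) into typed records."""
    return [Record(r["order_id"], float(r["amount"]), r["currency"]) for r in raw_rows]

def transform(records, fx_rates):
    """Normalize amounts to USD; drop records whose currency has no known rate."""
    out = []
    for rec in records:
        rate = fx_rates.get(rec.currency)
        if rate is not None:
            out.append(Record(rec.order_id, round(rec.amount * rate, 2), "USD"))
    return out

def load(records, sink):
    """Append transformed records to a target store (a list stands in for a table)."""
    sink.extend(records)
    return len(records)

raw = [{"order_id": 1, "amount": "10.0", "currency": "EUR"},
       {"order_id": 2, "amount": "5.0", "currency": "XXX"}]  # unknown currency is filtered out
sink = []
loaded = load(transform(extract(raw), {"EUR": 1.1, "USD": 1.0}), sink)
print(loaded, sink[0].amount)  # 1 11.0
```

On the actual platform, `extract` would read from ADLS Gen2 or a SQL source, `transform` would be a Spark job, and `load` would write to Synapse or a Delta table, but the staged structure is the same.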
Responsibilities
- Design, develop, optimize, and maintain scalable data architectures and pipelines aligned with ETL principles and business goals.
- Collaborate across teams to build simple, functional, and scalable data solutions.
- Troubleshoot and resolve complex data issues to support business insights and organizational objectives.
- Build and maintain data products to support company-wide usage.
- Advise, mentor, and coach data and analytics professionals on standards and best practices.
- Promote reusability, scalability, operational efficiency, and knowledge-sharing within the team.
- Develop comprehensive documentation for data engineering standards, processes, and capabilities.
- Participate in design and code reviews.
- Partner with business analysts and solution architects on enterprise-level technical architectures.
- Write high-quality, efficient, and maintainable code.
Technical Qualifications
- 5–8 years of progressive data engineering experience.
- Strong expertise in Databricks, Python, Scala, and Microsoft Azure services including Synapse & Azure Data Factory (ADF).
- Hands-on experience with data pipelines across multiple source & target systems (Databricks, Synapse, SQL Server, Data Lake, SQL/NoSQL sources, and file-based systems).
- Experience with design patterns, code refactoring, CI/CD, and building scalable data applications.
- Experience developing batch ETL pipelines; real-time streaming experience is a plus.
- Solid understanding of data warehousing, ETL, dimensional modeling, data governance, and handling both structured and unstructured data.
- Deep understanding of Synapse and SQL Server, including T-SQL and stored procedures.
- Proven experience working effectively with cross-functional teams in dynamic environments.
- Experience extracting, processing, and analyzing large, complex datasets.
- Strong background in root cause analysis for data and process issues.
- Advanced SQL proficiency and working knowledge of a variety of database technologies.
- Knowledge of Boomi is an added advantage.
Core Skills & Competencies
- Excellent analytical and problem-solving abilities.
- Strong communication and cross-team collaboration skills.
- Self-driven with the ability to make decisions independently.
- Innovative mindset and passion for building quality data solutions.
- Ability to understand operational systems, identify gaps, and propose improvements.
- Experience with large-scale data ingestion and engineering.
- Knowledge of CI/CD pipelines (preferred).
- Understanding of Python, Scala, and parallel processing frameworks such as MapReduce and Spark.
- Familiarity with Agile development methodologies.
Education
- Bachelor’s degree in Computer Science, Information Technology, MIS, or an equivalent field.
As a Data Engineer, you will be an integral part of our team, working on data pipelines, data warehousing, and data integration for various analytics and AI use cases. You will collaborate closely with Delivery Managers, ML Engineers, and other stakeholders to ensure seamless data flow and accessibility. Your expertise will be crucial in enabling data-driven decision-making for our clients. To thrive in this role, you need to be a quick learner, get excited about innovation, and be on the constant lookout to master new technologies as they come up in the Data, AI & Cloud teams.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support downstream analytics and AI applications.
- Collaborate with ML Engineers to integrate data solutions into machine learning models and workflows.
- Work closely with clients to understand their data requirements and deliver tailored data solutions.
- Ensure data quality, integrity, and security across all projects.
- Optimize and manage data storage solutions in cloud environments (AWS, Azure, GCP).
- Utilize Databricks for data processing and analytics tasks, leveraging its capabilities to enhance data workflows.
- Monitor the performance of data pipelines, identify bottlenecks or failures, and implement improvements to enhance efficiency and reliability.
- Implement best practices for data engineering, including documentation, testing, and version control.
- Troubleshoot and resolve data-related issues in a timely manner.
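The data-quality and monitoring responsibilities above usually translate into automated checks run against each pipeline batch. Below is a minimal, hypothetical sketch in plain Python; a production setup would more likely use a dedicated framework, and the field names are invented for the example.

```python
def run_quality_checks(rows, required_fields, unique_key):
    """Return a dict of failed checks for a batch of dict rows.

    Checks sketched here: required fields present and non-null,
    and no duplicate values of the designated unique key.
    """
    failures = {"missing_fields": [], "duplicate_keys": []}
    seen = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                failures["missing_fields"].append((i, field))
        key = row.get(unique_key)
        if key in seen:
            failures["duplicate_keys"].append(key)
        seen.add(key)
    # Report only the checks that actually failed.
    return {k: v for k, v in failures.items() if v}

batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},             # fails the required-field check
    {"id": 1, "email": "c@example.com"},  # fails the unique-key check
]
report = run_quality_checks(batch, required_fields=["email"], unique_key="id")
print(report)  # {'missing_fields': [(1, 'email')], 'duplicate_keys': [1]}
```

A non-empty report would typically fail the pipeline run or raise an alert before the batch is loaded downstream.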
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 3 to 5 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Hands-on experience with Databricks and its ecosystem.
- Familiarity with major cloud environments (AWS, Azure, GCP) and their data services.
- Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery.
- Comfortable working with a variety of SQL, NoSQL, and graph databases, such as PostgreSQL and MongoDB.
- Knowledge of data integration tools.
- Understanding of data modelling, data architecture, and database design.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Highly Desirable Skills
- Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming).
- Knowledge of data visualisation tools (e.g., Tableau, Power BI).
- Familiarity with machine learning concepts and frameworks.
- Experience working in a client-facing role.
Similar companies
About the company
Appknox, a leading mobile app security solution headquartered in Singapore and Bangalore, was founded by Harshit Agarwal and Subho Halder.
Since its inception, Appknox has become one of the go-to security solutions with the most powerful plug-and-play security platform, enabling security researchers, developers, and enterprises to build safe and secure mobile ecosystems using a system-plus human approach.
Appknox offers VA+PT (Vulnerability Assessment + Penetration Testing) solutions that provide end-to-end mobile application security and testing strategies to Fortune 500 companies, SMBs, and large enterprises globally, helping businesses and mobile developers make their mobile apps more secure, enhancing protection not only for their customers but also for their own brand.
Over the course of nine years, Appknox has scaled up to work with major brands in India, South-East Asia, the Middle East, Japan, and the US, and has successfully enabled some of the top government agencies with its on-premise deployments and compliance testing. Appknox serves 500+ enterprises, including 20+ Fortune 1000 companies and ministries/regulators across 10+ countries, as well as some of the top banks across 20+ countries.
A champion of Value SaaS with its customer- and security-first approach, Appknox has won many awards and recognitions from G2 and Gartner, and is one of the top mobile app security vendors in Gartner's 2021 Application Security Hype Cycle report.
Our forward-leaning, pioneering spirit is backed by SeedPlus, JFDI Asia, Microsoft Ventures, and Cisco Launchpad, and by a legacy of expertise that began in early 2014.
About the company
Who we are
We are Software Craftspeople. We are proud of the way we work and the code we write. We embrace and are evangelists of eXtreme Programming practices. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus own quality. And most importantly, we never stop learning!
We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. We work with large established institutions to help them create internal applications to automate manual operations and achieve scale.
We design software, and design the team as well as the organizational strategy required to successfully release robust and scalable products. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments with the aim of bringing a product mindset into services. More on our website: https://www.incubyte.co/
Join our team! We’re always looking for like-minded people!
About the company
At Hello Trade, an IndiaMART company, we specialize in offering the most suitable business loans to meet all your business needs. We provide options like Term Loans and Overdraft Facilities under Collateral-Free Business Loan and a Loan Against Property under secured business loans to fuel your business growth. Our commitment is to empower your business with the most suitable business loans at competitive rates. Let's work together to scale your business to new heights.





