
Codalyze Technologies
https://codalyze.com

Jobs at Codalyze Technologies
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight, advising development teams and fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages such as Java, Python, and Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools such as Airflow and Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
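To make the dimensional-modeling and "physical data models and the associated DDL" points above concrete, here is a minimal sketch of a star schema. All table and column names (dim_date, dim_product, fact_sales) are hypothetical, and SQLite stands in for whatever warehouse engine the team actually uses:

```python
import sqlite3

# A tiny star schema: one fact table with foreign keys into two
# dimension tables. Physical DDL like this would normally be
# generated from a logical data model.
DDL = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date  TEXT NOT NULL,
    year       INTEGER NOT NULL,
    month      INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL UNIQUE,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# Confirm the physical model was created from the DDL.
tables = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
print(sorted(tables))  # ['dim_date', 'dim_product', 'fact_sales']
```

Measures live in the fact table and descriptive attributes in the dimensions, which is the denormalized layout dimensional modeling favors for analytical queries.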
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool such as Pentaho, NiFi, or SSIS (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
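As a rough illustration of the "handling of various file formats" point, here is a stdlib-only Python sketch that normalizes CSV and JSON payloads into one record structure. In a real pipeline this would typically be done with Spark/PySpark readers rather than plain Python, and the field names here are hypothetical:

```python
import csv
import io
import json

def load_records(payload: str, fmt: str) -> list:
    """Normalize raw text in a given format into a list of dict records."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

csv_payload = "id,name\n1,widget\n2,gadget\n"
json_payload = '[{"id": "3", "name": "gizmo"}]'

# Different source formats, one downstream shape.
records = load_records(csv_payload, "csv") + load_records(json_payload, "json")
print([r["name"] for r in records])  # ['widget', 'gadget', 'gizmo']
```

The same dispatch-on-format idea extends to other formats a pipeline commonly meets (Parquet, Avro, ORC), though those need third-party libraries rather than the standard library.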




