
Codalyze Technologies
https://codalyze.com
Jobs at Codalyze Technologies
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise its game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, both internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially of Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow and Jenkins for CI/CD (optional)
- Designing and developing integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and reporting systems, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
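The data-modeling requirements above (logical models, physical models, and the DDL derived from them) can be sketched in a few lines. This is a minimal, hypothetical illustration: the entity, column names, and type mapping are invented for the example, not taken from the posting.

```python
# Hypothetical sketch: deriving CREATE TABLE DDL (a physical model) from a
# simple logical data model. Entity and column names are illustrative only.

LOGICAL_MODEL = {
    "customer": {
        "customer_id": ("integer", False),  # (logical type, nullable)
        "full_name": ("string", False),
        "signup_date": ("date", True),
    }
}

# Map logical types to physical (ANSI-SQL-flavoured) column types.
TYPE_MAP = {"integer": "BIGINT", "string": "VARCHAR(255)", "date": "DATE"}

def to_ddl(model: dict) -> str:
    """Render one CREATE TABLE statement per logical entity."""
    statements = []
    for table, columns in model.items():
        cols = [
            f"  {name} {TYPE_MAP[ltype]}{'' if nullable else ' NOT NULL'}"
            for name, (ltype, nullable) in columns.items()
        ]
        statements.append(f"CREATE TABLE {table} (\n" + ",\n".join(cols) + "\n);")
    return "\n".join(statements)

print(to_ddl(LOGICAL_MODEL))
```

In practice the logical model would come from a modeling tool and the type map would target a specific warehouse dialect; the point is the separation between the logical description and the generated physical DDL.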
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool such as Pentaho, NiFi, or SSIS (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
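The "handling of various file formats" skill above can be sketched with standard-library Python alone: normalising CSV and JSON Lines inputs into one record shape, as a pipeline's ingestion step might. The payloads and field names here are hypothetical examples, not from the posting.

```python
# Hypothetical sketch: parsing two common file formats (CSV and JSON Lines)
# into a single list-of-dicts record shape, using only the standard library.
import csv
import io
import json

def read_records(payload: str, fmt: str) -> list:
    """Parse raw text in the given format into a list of dict records."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "jsonl":
        return [json.loads(line) for line in payload.splitlines() if line.strip()]
    raise ValueError(f"unsupported format: {fmt}")

csv_data = "id,name\n1,alpha\n2,beta\n"
jsonl_data = '{"id": "3", "name": "gamma"}\n'

records = read_records(csv_data, "csv") + read_records(jsonl_data, "jsonl")
print([r["name"] for r in records])  # → ['alpha', 'beta', 'gamma']
```

A production pipeline would do the same normalisation with Spark's `DataFrameReader` (`spark.read.csv(...)`, `spark.read.json(...)`) so the downstream transformations see one schema regardless of the source format.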