
Codalyze Technologies
https://codalyze.com
Jobs at Codalyze Technologies
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise its game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, both internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight, advising development teams and fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- At least 3 years of relevant experience on production-grade projects, with hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools such as Airflow and Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
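To make the "well-optimized SQL queries" proficiency above concrete, here is a minimal sketch in plain Python using the stdlib sqlite3 module. The table, column, and index names are invented for illustration; the point is the index-aware query design a candidate would be expected to reason about, shown at toy scale rather than on a warehouse engine.

```python
import sqlite3

# In-memory database with a hypothetical events table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click" if i % 2 else "view", f"2024-01-{i % 28 + 1:02d}")
     for i in range(1000)],
)

# Without an index, the filter below forces a full table scan.
# A composite index matching the WHERE clause lets SQLite seek directly.
conn.execute("CREATE INDEX idx_events_type_user ON events (event_type, user_id)")

# EXPLAIN QUERY PLAN should report a SEARCH using the index, not a SCAN.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT COUNT(*) FROM events WHERE event_type = ? AND user_id = ?",
    ("click", 7),
).fetchall()

count = conn.execute(
    "SELECT COUNT(*) FROM events WHERE event_type = ? AND user_id = ?",
    ("click", 7),
).fetchone()[0]
```

Checking the query plan before and after adding the index is the habit the bullet point is really asking for; the same reasoning carries over to Hive partitioning or warehouse sort keys.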
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
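The "handling of various file formats" item in the must-know list can be illustrated at toy scale. The sketch below is pure Python stdlib (no Spark); the record shape and field names are invented for illustration. It normalizes CSV and JSON-lines inputs into one schema and aggregates them, which is the shape of the file-format work the role describes, minus the distributed runtime.

```python
import csv
import io
import json

def read_csv_records(text: str):
    """Parse CSV text into dicts with a unified schema."""
    return [
        {"user_id": int(row["user_id"]), "amount": float(row["amount"])}
        for row in csv.DictReader(io.StringIO(text))
    ]

def read_jsonl_records(text: str):
    """Parse JSON-lines text into the same unified schema."""
    return [
        {"user_id": int(rec["user_id"]), "amount": float(rec["amount"])}
        for rec in (json.loads(line) for line in text.splitlines() if line.strip())
    ]

# Two inputs in different formats, unified into one record stream.
csv_input = "user_id,amount\n1,9.5\n2,3.0\n"
jsonl_input = '{"user_id": 3, "amount": 1.5}\n{"user_id": 1, "amount": 2.0}\n'

records = read_csv_records(csv_input) + read_jsonl_records(jsonl_input)

# A simple aggregation over the unified records.
total_by_user = {}
for rec in records:
    total_by_user[rec["user_id"]] = total_by_user.get(rec["user_id"], 0.0) + rec["amount"]
```

In a Spark pipeline the per-format readers would be `spark.read.csv` / `spark.read.json` and the aggregation a `groupBy`, but the normalize-then-aggregate structure is the same.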