Codalyze Technologies
http://www.codalyze.com
Jobs at Codalyze Technologies
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise its game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties:
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred (see the sketch after this list).
- Work within the team and provide proactive technical oversight, advising development teams to foster re-use, design for scale, stability, and operational efficiency of data and analytical solutions
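For illustration only, here is a minimal, hypothetical PySpark sketch of a pipeline that ingests structured, semi-structured, and unstructured inputs in one job; all paths and names below are assumptions, not part of this posting.

```python
# Sketch only: one Spark job reading structured, semi-structured, and
# unstructured data. All paths and names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("multi-format-ingest").getOrCreate()

# Structured: CSV with a header row.
customers = spark.read.option("header", True).csv("hdfs:///landing/customers.csv")

# Semi-structured: newline-delimited JSON events.
events = spark.read.json("hdfs:///landing/events/")

# Unstructured: raw text logs, one line per record.
logs = spark.read.text("hdfs:///landing/logs/")

# Downstream steps would parse, conform, and join these sources before
# persisting curated tables (see the Hive sketch further below).
print(customers.count(), events.count(), logs.count())
```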
Education level:
- Bachelor's degree in Computer Science or equivalent
Experience:
- At least 3 years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in:
- Modern programming languages like Java, Python, Scala
- Big Data technologies such as Hadoop, Spark, Hive, and Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools such as Airflow and Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models (a rough sketch follows this list).
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
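As a rough illustration of the data-modeling bullets above, the following PySpark sketch turns a small logical star schema into physical Hive DDL and loads the fact table. The table names, columns, and input path are hypothetical, and a configured Hive metastore is assumed.

```python
# Sketch only: a hypothetical star schema (one dimension, one fact table)
# expressed as physical DDL and loaded with Spark. Assumes a configured
# Hive metastore; all table, column, and path names are made up.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("orders-dimensional-model")
    .enableHiveSupport()
    .getOrCreate()
)

# Physical DDL derived from the logical dimensional model.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key  BIGINT,
        customer_name STRING,
        country       STRING
    ) STORED AS PARQUET
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id     BIGINT,
        customer_key BIGINT,
        order_amount DOUBLE,
        order_date   DATE
    ) STORED AS PARQUET
""")

# Conform semi-structured input to the model and append to the fact table.
orders = spark.read.json("hdfs:///landing/orders/2024-01-01/")
(orders
    .selectExpr(
        "CAST(order_id AS BIGINT) AS order_id",
        "CAST(customer_key AS BIGINT) AS customer_key",
        "CAST(order_amount AS DOUBLE) AS order_amount",
        "CAST(order_date AS DATE) AS order_date",
    )
    .write.insertInto("fact_orders"))
```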
Skills:
Must Know:
- Core big-data concepts
- Spark - PySpark/Scala
- At least one data integration tool such as Pentaho, NiFi, or SSIS
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
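As a rough sketch of the orchestration skill above, assuming Airflow 2.x with spark-submit available on the worker (the DAG id, schedule, and script path are hypothetical):

```python
# Sketch only: a daily Airflow DAG that submits a PySpark job such as the
# ones above. Assumes Airflow 2.x; the DAG id, schedule, and script path
# are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_fact_orders = BashOperator(
        task_id="load_fact_orders",
        # spark-submit is assumed to be on PATH; the Spark provider's
        # SparkSubmitOperator could be used instead if it is installed.
        bash_command="spark-submit --master yarn /opt/jobs/load_fact_orders.py",
    )
```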