
Codalyze Technologies
https://codalyze.com

Jobs at Codalyze Technologies
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meet your standards, and satisfy both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, both internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work within the team and provide proactive technical oversight; advise development teams to foster re-use, design for scale, stability, and operational efficiency of data/analytical solutions.
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow and Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
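The modeling bullets above (logical-to-physical models, DDL generation, dimensional modeling) can be illustrated with a minimal sketch. The star schema below is hypothetical, chosen only to show how a logical dimensional model translates into physical DDL and a typical fact-over-dimension aggregate query; sqlite3 stands in for a production warehouse.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to one dimension table,
# as would be generated from a logical dimensional model.
DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    amount       REAL NOT NULL,
    sale_date    TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, 250.0, '2024-01-15')")

# Typical dimensional query: aggregate the fact, grouped by a dimension attribute.
row = conn.execute(
    """SELECT d.customer_name, SUM(f.amount)
       FROM fact_sales f JOIN dim_customer d USING (customer_key)
       GROUP BY d.customer_name"""
).fetchone()
print(row)  # ('Acme Corp', 250.0)
```

In a real warehouse the same shape appears with surrogate keys populated by the ETL layer and the DDL emitted by a modeling tool rather than written by hand.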
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
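The "handling of various file formats" skill above can be sketched as a toy extract-transform-load pass. The column names and schema here are hypothetical, and the standard library stands in for a real integration tool; Pentaho, NiFi, or a Spark job would follow the same read-normalize-write shape.

```python
import csv
import io
import json

# Hypothetical input: a small CSV extract with string-typed fields.
raw_csv = "id,amount\n1,10.5\n2,3.25\n"

def etl(csv_text: str) -> str:
    """Extract rows from CSV, normalize types, load as JSON lines."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Transform: cast string fields to their intended types.
        records.append({"id": int(row["id"]), "amount": float(row["amount"])})
    # Load: emit one JSON object per line, keys sorted for stable output.
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

print(etl(raw_csv))
```

The same pattern extends to Parquet, Avro, or ORC by swapping the reader and writer while keeping the normalization step in the middle.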