

Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that the optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning (see the batch-processing sketch after this list).
- Good understanding of object-oriented concepts and hands-on experience with Scala, with strong programming logic and technique.
- Good grasp of functional programming and OOP concepts in Scala.
- Strong SQL experience; should be able to write complex queries.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Able to mentor and onboard new team members onto the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large-scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, Redshift, Kinesis, Lambda, S3, etc. (preferred)
- Hands-on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Lead client calls to flag any delays, blockers, and escalations, and collate all requirements.
- Manage project timelines and client expectations, and meet deadlines.
- Should have held project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
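As a rough illustration of the Spark and Scala proficiency described in the list above, here is a minimal batch-processing sketch. It assumes a standalone Spark 3.x application; the input path, output path, and column names (events.parquet, userId, amount) are hypothetical placeholders rather than part of any actual project.

// A minimal sketch, assuming a standalone Spark 3.x application.
// Paths and column names below are hypothetical placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("event-aggregation").getOrCreate()

    // DataFrame core functions: read, filter, group, aggregate
    val events = spark.read.parquet("events.parquet")
    val totals = events
      .filter(col("amount") > 0)
      .groupBy("userId")
      .agg(sum("amount").as("totalAmount"))

    // Basic performance tuning: control partitioning and cache a reused result
    val cached = totals.repartition(200, col("userId")).cache()
    cached.write.mode("overwrite").parquet("totals.parquet")

    // Equivalent RDD-level view of the same aggregation
    val rddTotals = events.rdd
      .map(r => (r.getAs[String]("userId"), r.getAs[Double]("amount")))
      .reduceByKey(_ + _)
    println(s"Distinct users with activity: ${rddTotals.count()}")

    spark.stop()
  }
}

A real pipeline would tune the partition count and caching strategy to the data volume and cluster size; the values here are only illustrative.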
External Skills And Expertise
Must have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming (see the streaming sketch after these lists)
- AWS (preferred) or any cloud
- Kafka/Kinesis/any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices
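To make the Spark Streaming and Kafka expectations in the lists above concrete, here is a minimal Structured Streaming sketch in Scala. It assumes the spark-sql-kafka-0-10 connector is on the classpath; the broker address, topic name, and checkpoint path are hypothetical placeholders.

// A minimal sketch, assuming the spark-sql-kafka-0-10 connector is available.
// Broker, topic, and checkpoint path are hypothetical placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaStreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-stream-ingest").getOrCreate()

    // Read a Kafka topic as an unbounded streaming DataFrame
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers key/value as binary; cast the value to a readable string
    val payload = raw.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Count records per one-minute window, tolerating 5 minutes of late data
    val counts = payload
      .withWatermark("timestamp", "5 minutes")
      .groupBy(window(col("timestamp"), "1 minute"))
      .count()

    // Write running counts to the console sink for demonstration purposes
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/kafka-stream-ingest")
      .start()

    query.awaitTermination()
  }
}

A production job would typically write to a durable sink such as S3 or another Kafka topic instead of the console, but the read-transform-write structure stays the same.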

About Affine
Affine is a pioneering consulting firm that specializes in AI-driven enterprise transformation. With a robust focus on Generative AI, we deliver cutting-edge solutions that redefine industry standards.
Our comprehensive capabilities span the entire analytical value chain, from data to insights and transformation. We provide Cloud Advisory and Assessment, Cloud Migration, Data Lake Design and Development, Big Data Powered Advanced Analytics, AI & Deep Learning Solutions, and Deployment Consulting with Architecture Design.
Similar jobs

Role & Responsibilities
- Develop high-quality, scalable mobile applications for Android platform using Kotlin.
- Collaborate closely with cross-functional teams to define, design, and implement new features.
- Write clean, efficient, and maintainable code, following coding standards and best practices.
- Optimise mobile app performance to ensure a smooth and responsive user experience.
- Conduct code reviews and provide constructive feedback to team members.
- Stay updated with the latest mobile development trends, tools, and technologies.
- Troubleshoot and debug issues to maintain application stability.
- Participate in the full software development lifecycle, from concept to deployment and support.

• C++, Unix environment (Linux/AIX/HP-UX), Oracle/MySQL
• Excellent command of OOP
• Minimum of 3 years (for Mid and Junior) of hands-on work experience in C++, Unix
• Oracle/MySQL
• Hands-on experience with data structures, STL, Boost libraries, and design patterns
• Exposure to XML or EDIFACT is desired
• Exposure to XSLT mappings is a plus
• Excellent troubleshooting skills
• Exposure to CppUnit (or similar tools)
Experience range:
• 4 to 8 years of experience
Joining Location:
• Pune, Gandhinagar & Hyderabad (Preferably Pune & Gandhinagar)
We are looking for a complete fresher for a Node.js intern position who wants to build a career in an IT company and learn by working on live projects.
Location: Indore, M.P. (work from office only)
Responsibilities :-
1) He or she will do backend development (database, admin panel, and REST APIs)
2) He or she will learn backend logic with Node.js and a MongoDB database
3) He or she will work under senior guidance and learn the backend module by module, from scratch
4) He or she will be eligible for full-time roles if performance is good, along with a recommendation to other IT companies
5) He or she will receive an internship certificate, letter of recommendation, and placement assistance
*****************************************************
If interested in the internship, please call or visit the office for a detailed discussion and free counseling, Monday to Saturday, 11 AM to 7 PM.
Company name: Logical Soft Tech Pvt Ltd, Indore (M.P.)
Contact: +91-826.982.972.9 (HR), +91-786.973.159.5 (HR), +91-741.595.091.9 (HR), +91-821.025.182.4 (Technical Department)
Address: 2nd floor, 388, PU4, Scheme 54 PU4, next to Krozzon Hotel, in front of Old Eye Retina Hospital, Vijay Nagar, Indore, M.P.
Google Form
https://forms.gle/6HYUGMp3A8WdvmDS9
Referred by Sumit sir
Visual Treat Creator aka Designer
Work Areas:
Convert ideas into visual treats. These visual treats (output) can be in the form of:
- Web Related Banners
- Emailers
- Logo
- Mascot
- Outdoor Media
- Social Media Creatives
- Paid Ads Creatives
- Social Media Covers
- Presentations
- E-Books
- PDFs
Must Have
- Less of a copy cat but more of a cool cat
- Less lecturing/talk and more hands-on experience
- High on power of observation and story visualisation. Someone who sees art/design in everything around
- Understanding of disruptive themes, artwork, and agency-level, brand-centric manifestations
- Nothing less than global standards while working on any creative/ viral marketing areas
- Known to deliver cool work rather than keeping cool looks only
- Understanding of graphics / design tools at a professional level
- Hands-on experience with Photoshop, Corel, and Illustrator, with a brain of his/her own


We are looking for a candidate for a Full-stack Developer role.
- Experience - 3-5 yrs
- CTC to Offer - 10-15 Lacs
- Work Location - Mumbai / Chennai
Technical Skill set :
Front End: React JS / Ionic Framework
Back End: Spring Boot / Java
Database: Relational Database
Code Repository: Git or similar
Project Experience:
- Should have developed at least 2 projects end to end with the above tech stack, each with a minimum duration of 6 months
- Should have developed REST/JSON APIs with Spring Boot
- Should be able to develop a module independently end to end
- Should be able to design a simple database
- Should be able to understand and clarify client requirements
- Should be able to write FSDs
- Should be able to write unit test cases manually or using tools
- Should have worked with Service-oriented architecture
- Should know about code maintenance, code review, unit testing
Experience - 2 to 6 Years
Work Location - Pune
Datametica is looking for talented SQL engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.
Mandatory Skills:
- Strong in SQL development
- Hands-on at least one scripting language - preferably shell scripting
- Development experience in Data warehouse projects
Opportunities:
- Selected candidates will be provided learning opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
MLOps Engineer
Required Candidate profile :
- 3+ years’ experience in developing continuous integration and deployment (CI/CD) pipelines (e.g. Jenkins, GitHub Actions) and bringing ML models into CI/CD pipelines
- Candidate with strong Azure expertise
- Exposure to productionizing models
- Candidate should have complete knowledge of the Azure ecosystem, especially in the area of Data Engineering
- Candidate should have prior experience designing, building, testing, and maintaining machine learning infrastructure to empower data scientists to rapidly iterate on model development
- Develop continuous integration and deployment (CI/CD) pipelines on top of Azure that include Azure ML, MLflow, and Azure DevOps
- Proficient knowledge of git, Docker and containers, Kubernetes
- Familiarity with Terraform
- E2E production experience with Azure ML, Azure ML Pipelines
- Experience with the Azure ML extension for Azure DevOps
- Worked on model drift (concept drift, data drift), preferably on Azure ML
- Candidate will be part of a cross-functional team that builds and delivers production-ready data science projects. You will work with team members and stakeholders to creatively identify, design, and implement solutions that reduce operational burden, increase reliability and resiliency, ensure disaster recovery and business continuity, enable CI/CD, optimize ML and AI services, and maintain it all as infrastructure as code in an everything-in-version-control manner.

- Deep hands-on experience in designing & developing Python based applications
- Hands-on experience building database-backed web applications using Python based frameworks
- Excellent knowledge of Linux and experience developing Python applications that are deployed in Linux environments
- Experience building client-side and server-side API-level integrations in Python
- Experience in containerization and container orchestration systems like Docker, Kubernetes, etc.
- Experience with NoSQL document stores like the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Development experience with modern JavaScript based front end frameworks, especially Vue.js
- Experience in test automation and TDD
- Experience testing interactive applications with unit testing frameworks for the various technology stacks
- Experience in using and managing Git based version control systems - Azure DevOps, GitHub, Bitbucket etc.
- Experience in using project management tools like Jira, Azure DevOps etc.
- Expertise in Cloud based development and deployment using cloud providers like AWS or Azure

