
DATA ENGINEERING CONSULTANT
About NutaNXT: NutaNXT is a next-gen Software Product Engineering services provider building ground-breaking products using AI/ML, Data Analytics, IoT, Cloud, and other emerging technologies that are disrupting global markets. Our mission is to help clients leverage our specialized Digital Product Engineering capabilities in Data Engineering, AI Automations, and full-stack software solutions and services to build best-in-class products and stay ahead of the curve. You will get the chance to work on multiple projects critical to NutaNXT's needs, with opportunities to learn, develop new skills, and switch teams and projects as you and our fast-paced business grow and evolve.
Location: Pune
Experience: 6 to 8 years
Job Description: NutaNXT is looking for a consultant to support the planning and implementation of data design services, provide sizing and configuration assistance, and perform needs assessments, as well as to deliver architectures for transformations and modernizations of enterprise data solutions using Azure cloud data technologies. As a Data Engineering Consultant, you will collect, aggregate, store, and reconcile data in support of the customer's business decisions. You will design and build data pipelines, data streams, data service APIs, data generators, and other end-user information portals and insight tools.
Mandatory Skills:
- Demonstrable experience with enterprise-level data platforms, including implementation of end-to-end data pipelines with Python or Scala
- Hands-on experience with at least one of the leading public cloud data platforms (ideally Azure)
- Experience with different databases (column-oriented, NoSQL, RDBMS)
- Experience architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks like Azure Databricks, Azure Data Factory, Spark, Spark Streaming, etc.
- Understanding of data modeling, warehouse design, and fact/dimension concepts
- Good communication skills
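The fact/dimension concepts listed above can be sketched in a few lines. This is an illustrative example only: plain Python dictionaries stand in for a Spark/Databricks join, and the table and column names are hypothetical.

```python
# Star-schema sketch: enrich fact rows with attributes from a dimension table.
# Plain Python stands in for a Spark join; all names here are hypothetical.

dim_product = {  # dimension table keyed by surrogate key
    101: {"name": "Widget", "category": "Hardware"},
    102: {"name": "Gadget", "category": "Electronics"},
}

fact_sales = [  # fact table rows referencing the dimension
    {"product_id": 101, "qty": 3, "amount": 30.0},
    {"product_id": 102, "qty": 1, "amount": 99.0},
]

def enrich(facts, dim):
    """Join each fact row to its dimension row via the surrogate key."""
    return [
        {**row, **dim.get(row["product_id"], {"name": None, "category": None})}
        for row in facts
    ]

enriched = enrich(fact_sales, dim_product)
```

In a real warehouse the same join would run in Spark SQL or PySpark over fact and dimension tables, but the key-lookup shape is identical.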
Good To Have:
- Certifications for any of the cloud services (ideally Azure)
- Experience working with code repositories and continuous integration
- Understanding of development and project methodologies
Why Join Us?
We offer innovative work in the AI and Data Engineering space, a unique, diverse workplace environment, and continuous learning and development opportunities. These are just some of the reasons we're consistently recognized as one of the best companies to work for, and why our people choose to grow their careers at NutaNXT. We also offer a highly flexible, self-driven, remote work culture that fosters innovation, creativity, and work-life balance, along with industry-leading compensation. We believe this helps us consistently deliver for our clients and grow in the highly competitive, fast-evolving Digital Engineering space, with a strong focus on building advanced software products for clients in the US, Europe, and APAC regions.

Similar jobs
Job Title : Senior Backend Developer (Node.js + AWS + MongoDB)
Experience : 4+ Years
Location : Andheri, Mumbai (Work From Office)
About the Role :
We are looking for a highly skilled Senior Backend Developer with strong expertise in Node.js (NestJS), AWS, and MongoDB to join our growing engineering team.
This role requires someone who takes ownership, is proactive, and enjoys building scalable, high-performance backend systems in a fast-paced environment.
Key Responsibilities :
- Architect, design, and develop scalable backend services using Node.js (NestJS).
- Design and manage cloud infrastructure on AWS Services (EC2, ECS, RDS, Lambda, etc.).
- Develop and maintain high-performance database solutions using MongoDB.
- Work with Kafka, Docker, and serverless frameworks (SST) for efficient deployments.
- Optimize system performance, scalability, and reliability across services.
- Ensure application security, best practices, and compliance standards.
- Collaborate with cross-functional teams to deliver robust product features.
- Take end-to-end ownership of features from design to deployment.
Technical Requirements :
- 4+ years of backend development experience.
- 3+ years of hands-on experience with Node.js.
- 2+ years of hands-on experience with AWS.
- Strong experience with NestJS framework.
- Solid experience with MongoDB and database design.
- Experience with Kafka, Docker, and serverless architecture.
- Understanding of system design, scalability, and performance optimization.
Good to Have (Bonus Skills) :
- Experience with Python or other backend languages.
- Exposure to Agentic AI use cases or implementations.
- Strong understanding of security best practices.
What We’re Looking For :
- Curious mindset and eagerness to learn new technologies.
- Proactive problem solver with strong ownership attitude.
- Strong team player with effective communication skills.
- Positive, energetic, and passionate about building great systems.
- Strong Software Engineering Profile
- Mandatory (Experience 1): Must have 5+ years of experience using Python to design software solutions.
- Mandatory (Skills 1): Strong working experience with Python (with Django framework experience) and Microservices architecture is a must.
- Mandatory (Skills 2): Must have experience with event-driven architectures using Kafka.
- Mandatory (Skills 3): Must have experience in DevOps practices and container orchestration using Kubernetes, along with cloud platforms like AWS, GCP, or Azure.
- Mandatory (Company): Must have experience working in product companies; experience in fintech or banking is a plus.
- Mandatory (Education): Educational background must be from a premium institute (IIT, IIIT, NIT, MNNIT, VITS, BITS).
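The event-driven architecture requirement above can be sketched minimally. This is a toy illustration, not a Kafka client example: an in-memory queue stands in for a Kafka topic, and the event schema and handler are hypothetical.

```python
import json
import queue

# Event-driven sketch: an in-memory queue stands in for a Kafka topic.
# Topic, event schema, and handler are hypothetical illustrations only.

topic = queue.Queue()  # stands in for a Kafka topic partition

def produce(event: dict) -> None:
    """Serialize and publish an event, as a Kafka producer would."""
    topic.put(json.dumps(event).encode("utf-8"))

def consume_all(handler) -> int:
    """Drain the topic, dispatching each event to the handler."""
    count = 0
    while not topic.empty():
        handler(json.loads(topic.get().decode("utf-8")))
        count += 1
    return count

seen = []
produce({"type": "payment.created", "amount": 250})
produce({"type": "payment.settled", "amount": 250})
processed = consume_all(seen.append)
```

With real Kafka the producer and consumer would be separate services connected by a broker, but the publish/serialize/dispatch shape is the same.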
Must-Have Skills:
- Hands-on experience with air-gapped Kubernetes clusters, ideally in regulated industries (finance, healthcare, etc.).
- Strong expertise in CI/CD pipelines, programmable infrastructure, and automation.
- Proficiency in Linux troubleshooting, observability (Prometheus, Grafana, ELK), and multi-region disaster recovery.
- Security & compliance knowledge for regulated industries.
- Preferred: Experience with GKE, RKE, Rook-Ceph and certifications like CKA, CKAD.
Who You Are
- A Kubernetes expert who thrives on scalability, automation, and security.
- Passionate about optimizing infrastructure, CI/CD, and high-availability systems.
- Comfortable troubleshooting Linux, improving observability, and ensuring disaster recovery readiness.
- A problem solver who simplifies complexity and drives cloud-native adoption.
What You’ll Do
- Architect & automate Kubernetes solutions for air-gapped and multi-region clusters.
- Optimize CI/CD pipelines & cloud-native deployments.
- Work with open-source projects, selecting the right tools for the job.
- Educate & guide teams on modern cloud-native infrastructure best practices.
- Solve real-world scaling, security, and infrastructure automation challenges.
Why Join Us?
- Work on high-impact Kubernetes projects in regulated industries.
- Solve real-world automation & infrastructure challenges with cutting-edge tools.
- Grow in a team that values learning, open-source contributions, and innovation.
Blue Owls Solutions is looking for a mid-level Azure Data Engineer with approximately 4 years of hands-on experience to join our growing data team. In this role, you will design, build, and maintain scalable data pipelines and architectures that power business-critical analytics and reporting. You'll work closely with cross-functional teams to transform raw data into reliable, high-quality datasets that drive decision-making across the organization.
Required Skills
- 4+ years of professional experience as a Data Engineer or in a similar data-focused role
- Strong proficiency in SQL for data manipulation, querying, and performance optimization
- Hands-on experience with PySpark for large-scale data processing and transformation
- Solid working knowledge of the Microsoft Azure ecosystem (Azure Data Factory, Azure Data Lake, Azure Synapse, etc.)
- Experience with Microsoft Fabric for end-to-end data analytics workflows
- Ability to design and implement robust data architectures including data warehouses, lakehouses, and ETL/ELT frameworks
- Strong coding and scripting skills with Python
- Proven problem-solving ability with a knack for debugging complex data issues and optimizing pipeline performance
- Understanding of data modeling concepts, dimensional modeling, and data governance best practices
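The ETL/ELT framework requirement above often comes down to incremental loads. Below is a minimal sketch of an upsert/merge step, assuming plain Python stands in for a PySpark/Delta MERGE in Azure; the table and key names are hypothetical.

```python
# Incremental-load (upsert) sketch: plain Python stands in for a
# PySpark/Delta MERGE; table, keys, and columns are hypothetical.

warehouse = {  # target table keyed by business key
    "C001": {"name": "Acme", "updated": "2024-01-01"},
}

batch = [  # fresh extract from the source system
    {"key": "C001", "name": "Acme Corp", "updated": "2024-02-01"},
    {"key": "C002", "name": "Globex", "updated": "2024-02-01"},
]

def merge(target: dict, rows: list) -> tuple:
    """Upsert rows into target; return (inserted, updated) counts."""
    inserted = updated = 0
    for row in rows:
        key = row["key"]
        record = {k: v for k, v in row.items() if k != "key"}
        if key in target:
            updated += 1
        else:
            inserted += 1
        target[key] = record
    return inserted, updated

inserted, updated = merge(warehouse, batch)
```

In Azure the same step would typically be a Delta Lake `MERGE INTO` driven by Data Factory or a Fabric pipeline; the matched/not-matched branching is the core of it.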
Interview Process
- Take-Home Assessment
- 60-Minute Technical Interview
- Culture Fit Round
Preferred Skills & Certifications
- Microsoft Certified: Fabric Analytics Engineer Associate (DP-600)
- Microsoft Certified: Fabric Data Engineer Associate (DP-700)
- Experience with CI/CD practices for data pipelines
- Familiarity with version control systems such as Git
- Exposure to real-time streaming data solutions
- Experience working in Agile or Scrum environments
- Strong communication skills with the ability to translate technical concepts for non-technical stakeholders
What We Offer
- Competitive salary and performance-based bonuses
- Flexible hybrid options
- Opportunities for professional development, training, and certification sponsorship
- A collaborative, innovation-driven team culture
- Paid time off and company holidays
Job Description:
- Provide pre-sales technical assistance and product education to customers.
- Respond to inquiries and resolve issues promptly to ensure customer satisfaction.
- Collaborate with the sales team to develop and implement effective sales strategies.
- Identify opportunities for up-selling and cross-selling our products and services.
- Utilize engineering knowledge to understand customer needs and recommend suitable products or solutions.
- Conduct product demonstrations and technical presentations.
- Prepare and deliver accurate and competitive sales quotations, proposals, and contracts.
- Ensure all customer communications are clear, professional, and effective.
- Build and maintain strong relationships with existing and potential customers.
- Act as the primary point of contact for customer inquiries and follow up on leads.
- Work closely with the engineering, marketing, manufacturing, accounts, and logistics teams to ensure alignment between customer needs and product offerings.
- Track and report on sales activities, including lead generation, customer interactions, and sales performance.
- Provide regular updates to senior management
Business Development Intern - Come Grow with Us! 🚀💼
Hybrid, New Delhi
Theo Hackie: Where creativity meets data, and magic happens! 🌟
We're North India’s premier 360° marketing solutions agency, shaking things up since 2016! 🚀 We've fueled growth for 30+ clients, from government bodies to global brands and high-growth startups. How? By blending creativity, analytics, and client-first thinking to deliver killer results!
Your Mission:
Join our tribe as a Business Development Intern and be the rainmaker we've been searching for! ☁️ You'll drive business growth by generating leads, closing new clients, and expanding opportunities in the marketing, branding, and digital campaigns space.
Your Superpowers:
- Generate and qualify leads, building a robust pipeline for agency services 📈
- Conduct sales outreach, pitch agency solutions, and close deals with B2B clients 🤝
- Identify growth opportunities with existing clients and upsell relevant services 📊
- Collaborate with internal teams (creative, digital, strategy) to tailor solutions 🌈
- Analyze market trends, competitors, and client needs to refine sales strategies 📊
- Meet or exceed monthly/quarterly sales targets and KPIs 🔥
- Maintain accurate records of interactions and sales in CRM systems 📁
What You Bring:
- 1 year of experience in sales for a marketing, advertising, or digital agency (freshers are also welcome) 📚
- Proven track record of achieving and exceeding sales targets 🏆
- Strong skills in business development, lead generation, and client acquisition 💡
- Excellent communication, negotiation, and presentation skills 🗣️
- Ability to work independently in a remote environment and manage multiple clients 💻
- Bachelor’s degree in Business, Marketing, or a related field 📚
- Experience with B2B marketing campaigns, digital marketing services, and branding solutions is a plus 🌟
We're 180° - the smarter half that flips everything.

Core Focus:
- Operate with a full DevOps mindset, owning the software lifecycle from development through production support.
- Participate in Agile ceremonies and global team collaboration, including on-call support.
Mandatory/Strong Technical Skills (6–8+ years of relevant experience required):
- Java: 4.5 to 6.5 years of experience.
- AWS: Strong knowledge of and working experience with cloud technologies (minimum 2 years).
- Kafka: Strong knowledge of and working experience with data integration technologies (minimum 2 years).
- Databases: Experience with SQL/NoSQL databases (e.g., Postgres, MongoDB).
Other Key Technologies & Practices:
- Python, Spring Boot, and API-based system design.
- Containers/Orchestration (Kubernetes).
- CI/CD tools (GitLab) and monitoring/observability tools (Splunk, Datadog).
- Familiarity with Terraform and Airflow.
- Experience in Agile methodology (Jira, Confluence).
Job Title: Python Django Microservices Lead (Django Backend Lead Developer)
Location: Indore/ Pune (Hybrid - Wednesday and Thursday WFO)
Timings: 12:30 PM to 9:30 PM
Experience Level: 8+ Years
Job Overview: We are seeking an experienced Django Backend Lead Developer to join our team. The ideal candidate will have a strong background in backend development, cloud technologies, and big data processing. This role involves leading technical projects, mentoring junior developers, and ensuring the delivery of high-quality solutions.
Responsibilities:
Lead the development of backend systems using Django.
Design and implement scalable and secure APIs.
Integrate Azure Cloud services for application deployment and management.
Utilize Azure Databricks for big data processing and analytics.
Implement data processing pipelines using PySpark.
Collaborate with front-end developers, product managers, and other stakeholders to deliver comprehensive solutions.
Conduct code reviews and ensure adherence to best practices.
Mentor and guide junior developers.
Optimize database performance and manage data storage solutions.
Ensure high performance and security standards for applications.
Participate in architecture design and technical decision-making.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
8+ years of experience in backend development.
8+ years of experience with Django.
Proven experience with Azure Cloud services.
Experience with Azure Databricks and PySpark.
Strong understanding of RESTful APIs and web services.
Excellent communication and problem-solving skills.
Familiarity with Agile methodologies.
Experience with database management (SQL and NoSQL).
Skills: Django, Python, Azure Cloud, Azure Databricks, Delta Lake and Delta tables, PySpark, SQL/NoSQL databases, RESTful APIs, Git, and Agile methodologies
We have a farm at Denkennikottai, Tamil Nadu, where we grow varieties of roses and fruits. We want to sell flowers to apartments on a subscription basis, so we need a Sales and Marketing person who will talk to residents of apartments and sell flowers, and deliver the flowers to designated collection points (it is not a door-to-door delivery).
Job Description
- Visit apartments and talk to people to generate leads to sell bunches of flowers on a weekly/monthly/yearly subscription basis.
- Should be able to visit the farm (around 70 km away) and collect flowers once a week.
- Deliver the flower bunches at different collection points
- Talk to supermarkets and other stores to establish collection points near the apartments
Requirements:
- Should be fluent in Kannada and English
- Must have a vehicle to travel to the farm
- Should be from Bangalore
- Must be dynamic and good at interacting with people
Perks:
- There will be a fixed salary of 10k to 12k
- There will be commissions based on new sales
- Petrol expenses will be reimbursed for the farm visit