
Job Title: Senior Backend Engineer – Java, AI & Automation
Experience: 4+ Years
Location: Any Cognizant location (India)
Work Mode: Hybrid
Interview Rounds:
- Virtual
- Face-to-Face (In-person)
Job Description:
Join our Backend Engineering team to design and maintain services on the Intuit Data Exchange (IDX) platform.
You'll work on scalable backend systems powering millions of daily transactions across Intuit products.
Key Qualifications:
- 4+ years of backend development experience.
- Strong in Java and the Spring framework.
- Experience with microservices, databases, and web applications.
- Proficient in AWS and cloud-based systems.
- Exposure to AI and automation tools (Workato preferred).
- Python development experience.
- Strong communication skills.
- Comfortable with occasional US shift overlap.
Role Overview
Join our core tech team to build the intelligence layer of Clink's platform. You'll architect AI agents, design prompts, build ML models, and create systems powering personalized offers for thousands of restaurants. High-growth opportunity working directly with founders, owning critical features from day one.
Why Clink?
Clink revolutionizes restaurant loyalty using AI-powered offer generation and customer analytics:
- ML-driven customer behavior analysis (pattern detection)
- Personalized offers via LLMs and custom AI agents
- ROI prediction and forecasting models
- Instagram marketing rewards integration
Tech Stack:
- Python
- FastAPI
- PostgreSQL
- Redis
- Docker
- LLMs
You Will Work On:
AI Agents: Design and optimize AI agents
ML Models: Build redemption prediction, customer segmentation, ROI forecasting
Data & Analytics: Analyze data, build behavior pattern pipelines, create product bundling matrices
System Design: Architect scalable async AI pipelines, design feedback loops, implement A/B testing
Experimentation: Test different LLM approaches, explore hybrid LLM+ML architectures, prototype new capabilities
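The A/B testing mentioned above is often built on deterministic hash-based bucketing, so a customer sees the same variant on every visit. A minimal sketch (function and experiment names are illustrative, not Clink's actual system):

```python
import hashlib

def ab_bucket(customer_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    """Deterministically assign a customer to 'treatment' or 'control'.

    Hashing (experiment, customer_id) keeps the assignment stable across
    sessions and independent between experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{customer_id}".encode()).hexdigest()
    # Map the first 8 hex chars to a float in [0, 1).
    fraction = int(digest[:8], 16) / 0x100000000
    return "treatment" if fraction < treatment_share else "control"
```

Because the bucket is a pure function of the IDs, no assignment table is needed and offline analysis can recompute it from logs.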
Must-Have Skills
Technical: 0-2 years AI/ML experience (projects/internships count), strong Python, LLM API knowledge, ML fundamentals (supervised learning, clustering), Pandas/NumPy proficiency
Mindset: Extreme curiosity, logical problem-solving, builder mentality (side projects/hackathons), ownership mindset
Nice to Have: Pydantic, FastAPI, statistical forecasting, PostgreSQL/SQL, scikit-learn, food-tech/loyalty domain interest
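Customer segmentation of the kind listed under ML fundamentals can be illustrated with a bare-bones k-means loop in NumPy (a toy sketch with invented feature values, not Clink's actual model):

```python
import numpy as np

def kmeans(X: np.ndarray, k: int, iters: int = 50, seed: int = 0) -> np.ndarray:
    """Return a cluster label per row of X using plain k-means."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its cluster.
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Toy data: two obvious customer groups (monthly visits, avg spend).
X = np.array([[1.0, 10.0], [1.2, 11.0], [9.0, 90.0], [9.5, 95.0]])
labels = kmeans(X, k=2)
```

In practice scikit-learn's `KMeans` replaces this loop; the sketch just shows the assign/update structure.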
1) Be open to learning new frameworks like Hapi.js, TypeScript, Nest.js
2) Strong DB concepts and hands-on knowledge of MongoDB, Redis
3) Experience working with microservices will be a plus
4) Experience working with JWT and IAM systems will be a plus
5) Experience working with Postman, Swagger will be a plus
6) TDD knowledge is an advantage, as is experience writing unit tests and familiarity with test coverage concepts
7) Strong operating system knowledge is a plus, including how to manage threads
8) Working experience with RabbitMQ, Kafka will be a plus
9) Strong knowledge of JS internals is a must
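The JWTs in point 4 are three base64url-encoded segments (header.payload.signature). A quick way to inspect a payload, shown here without any signature verification (never skip verification in production; the token below is a throwaway built for illustration):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the payload segment of a JWT. Does NOT verify the signature."""
    payload_b64 = token.split(".")[1]
    # base64url may omit '=' padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a throwaway token by hand to demonstrate (signature left empty).
claims = {"sub": "user-42", "role": "admin"}
segment = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
token = f"eyJhbGciOiJIUzI1NiJ9.{segment}."
```

Real services should use a maintained library (e.g. `jsonwebtoken` in Node.js) that checks the signature and standard claims.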
You can contact me at 9316120132.
🚀 About Us
At Remedo, we're building the future of digital healthcare marketing. We help doctors grow their online presence, connect with patients, and drive real-world outcomes like higher appointment bookings and better Google reviews — all while improving their SEO.
We’re also the creators of Convertlens, our generative AI-powered engagement engine that transforms how clinics interact with patients across the web. Think hyper-personalized messaging, automated conversion funnels, and insights that actually move the needle.
We’re a lean, fast-moving team with startup DNA. If you like ownership, impact, and tech that solves real problems — you’ll fit right in.
🛠️ What You’ll Do
- Build and maintain scalable Python back-end systems that power Convertlens and internal applications.
- Develop Agentic AI applications and workflows to drive automation and insights.
- Design and implement connectors to third-party systems (APIs, CRMs, marketing tools) to source and unify data.
- Ensure system reliability with strong practices in observability, monitoring, and troubleshooting.
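Connectors to third-party APIs like those above usually need retry with exponential backoff to ride out transient failures. A minimal sketch (the callable and limits are illustrative, not Remedo's actual stack):

```python
import time

def with_retries(call, attempts: int = 4, base_delay: float = 0.5):
    """Run `call()`, retrying transient failures with exponential backoff.

    Delays grow as base_delay * 2**attempt; the final failure is re-raised.
    """
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Production connectors typically add jitter to the delay and restrict retries to idempotent requests.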
⚙️ What You Bring
- 2+ years of hands-on experience in Python back-end development.
- Strong understanding of REST API design and integration.
- Proficiency with relational databases (MySQL/PostgreSQL).
- Familiarity with observability tools (logging, monitoring, tracing — e.g., OpenTelemetry, Prometheus, Grafana, ELK).
- Experience maintaining production systems with a focus on reliability and scalability.
- Bonus: Exposure to Node.js and modern front-end frameworks like React.
- Strong problem-solving skills and comfort working in a startup/product environment.
- A builder mindset — scrappy, curious, and ready to ship.
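The observability expectations above often start with structured logs that tools like ELK or Grafana Loki can index. A minimal JSON formatter using only the stdlib (logger and field names are illustrative):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each record as a single JSON object, easy to ship to a log indexer."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("convertlens")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
```

One JSON object per line means downstream parsers never have to guess at message boundaries.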
💼 Perks & Culture
- Flexible work setup — remote-first for most, hybrid if you’re in Delhi NCR.
- A high-growth, high-impact environment where your code goes live fast.
- Opportunities to work with Agentic AI and cutting-edge tech.
- Small team, big vision — your work truly matters here.

🚀 We're Hiring: Staff Engineer – Computer Vision & Machine Learning 🚀
📍 Location: Gurugram
💼 Experience: 7-10 Years
About the Role
We are seeking a passionate Computer Vision and Machine Learning expert to develop advanced solutions in medical imaging. If you’re excited about challenges like image segmentation, object detection, and 3D reconstruction, let’s connect!
Mandatory Technical Skills:
✅ Programming Languages: Proficiency in C++, Python, and C#.
✅ Computer Vision Expertise: Experience with OpenCV for tasks such as:
- Image Segmentation
- Object Detection
- Pattern Recognition
- 3D Reconstruction
✅ ML/DL Frameworks: Proficiency in TensorFlow, PyTorch; hands-on experience with models like YOLO, U-Net, ResNet, VGG.
✅ Image Processing: Strong understanding of:
- Color-space transformations
- Histogram/Contrast enhancement
- Morphology and frequency-domain filtering
✅ Medical Imaging Knowledge: Familiarity with modalities like CT, MRI, Ultrasound.
✅ Algorithms & Data Structures: Solid problem-solving, optimization, and analytical skills.
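The histogram/contrast enhancement listed above can be sketched with plain NumPy histogram equalization (OpenCV's `cv2.equalizeHist` performs the same operation for 8-bit images; the sample image here is synthetic):

```python
import numpy as np

def equalize_hist(img: np.ndarray) -> np.ndarray:
    """Histogram-equalize an 8-bit grayscale image.

    Maps intensities through the normalized CDF so the output
    spreads over the full 0-255 range.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Classic equalization formula, scaled to [0, 255].
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# A low-contrast image confined to [100, 120] stretches to span [0, 255].
img = np.linspace(100, 120, 64, dtype=np.uint8).reshape(8, 8)
out = equalize_hist(img)
```

For medical modalities like CT, windowing to a diagnostic intensity range typically precedes any such enhancement.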
Additional Preferred Skills:
🔹 Advanced C++ (STL, multithreading, design patterns).
🔹 Python libraries: NumPy, Pandas, scikit-learn.
🔹 Knowledge of tools like VTK, ITK, or MITK.
🔹 Understanding of healthcare regulations (e.g., FDA, CE).
Why This Role?
You’ll build cutting-edge Computer Vision and Machine Learning solutions for healthcare, making a meaningful impact on lives.

- Experience building and managing large-scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem-solving abilities, along with a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter: someone who thrives in fast-paced environments with minimal 'management'.
- Have exposure to and working knowledge of AI environments, with machine learning experience.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elasticsearch.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog, etc.
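The MapReduce model mentioned above can be sketched in a few lines of pure Python; Spark and Hadoop add distribution and fault tolerance on top of the same map/shuffle/reduce shape (the word-count example is the classic illustration, not a production pattern):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal single-process MapReduce: map each record to (key, value)
    pairs, group values by key (the shuffle), then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# Classic word count.
lines = ["big data big models", "big data"]
counts = map_reduce(
    lines,
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
```

In Spark the same computation is a `flatMap` followed by `reduceByKey`, with the grouping performed across the cluster.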
