- Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with maintaining Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for the ingestion and aggregation of different events, loading consumer response data into Hive external tables at an HDFS location to serve as the feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience working with ETL tool environments like SSIS and Informatica, and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge of installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
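Several bullets above describe the same pipeline shape: ingest events, aggregate them, and load the result into Hive external tables as a feed for Tableau. A minimal pure-Python sketch of that aggregation step (the event fields and campaign names are invented for illustration; in production this would be a Hive or Spark SQL `GROUP BY`):

```python
from collections import defaultdict

# Hypothetical consumer-response events, as they might arrive from an
# ingestion layer (Flume/Kafka) before being aggregated for Hive.
events = [
    {"campaign": "spring_sale", "response": "click"},
    {"campaign": "spring_sale", "response": "open"},
    {"campaign": "spring_sale", "response": "click"},
    {"campaign": "newsletter",  "response": "open"},
]

def aggregate_by_campaign(events):
    """Count responses per (campaign, response) pair -- the same
    aggregation you would express as GROUP BY in HiveQL or Spark SQL."""
    counts = defaultdict(int)
    for e in events:
        counts[(e["campaign"], e["response"])] += 1
    # Rows shaped for loading into a Hive external table.
    return [
        {"campaign": c, "response": r, "cnt": n}
        for (c, r), n in sorted(counts.items())
    ]

rows = aggregate_by_campaign(events)
for row in rows:
    print(row)
```

The equivalent HiveQL would be along the lines of `SELECT campaign, response, COUNT(*) FROM events GROUP BY campaign, response`, with the output table declared `EXTERNAL` over an HDFS location so Tableau can read it.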

Role & Responsibilities
As a Senior GenAI Engineer you will own the AI layer of our product — building the features that make Zenskar intelligent. This is not a research role and not a prompt-engineering role. You will build production AI systems that enterprise clients depend on, which means reliability, observability, and rigorous evals matter as much as the AI capability itself. You own the full vertical — the model, the pipeline, and the UI.
- Build and own CS Copilot — a real-time assistant for customer success teams, spanning STT pipelines, live transcription, and LLM-powered suggestions
- Build LLM-powered document understanding features — extracting structured, reliable data from unstructured enterprise documents
- Own AI feature UIs end-to-end — you build the interface, not just the model integration layer
- Design and maintain an eval framework — define what 'working' means for each AI feature and catch regressions before users do
- Drive model selection and integration decisions — choosing the right provider and approach for each use case, managing latency and cost
- Own AI platform reliability — observability, fallback behaviour, and graceful degradation when models fail
- Work closely with product, customer success, and the full-stack engineer — AI features only matter if they are usable and trusted by real users
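The eval-framework responsibility above has a concrete minimal shape: a fixed set of test cases, a scoring function, and a pass-rate threshold that gates releases. A hedged pure-Python sketch — the cases, the toy "model", and the threshold are all illustrative stand-ins, not Zenskar's actual framework:

```python
# Minimal eval-harness sketch: run a model function over fixed cases
# and compute a pass rate, so regressions are caught before users see them.

EVAL_CASES = [
    {"input": "2 + 2", "expected": "4"},
    {"input": "capital of France", "expected": "Paris"},
]

def toy_model(prompt: str) -> str:
    # Stand-in for a real LLM call (e.g. an OpenAI or Gemini request).
    canned = {"2 + 2": "4", "capital of France": "Paris"}
    return canned.get(prompt, "I don't know")

def run_evals(model, cases, threshold=0.9):
    """Score each case by exact match and gate on the pass rate."""
    passed = sum(1 for c in cases if model(c["input"]) == c["expected"])
    rate = passed / len(cases)
    # Gate: a pass rate below the threshold should block the release.
    return {"pass_rate": rate, "ok": rate >= threshold}

result = run_evals(toy_model, EVAL_CASES)
print(result)
```

Real eval suites usually replace exact-match scoring with graded rubrics or LLM-as-judge scoring, but the run/score/gate structure stays the same.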
The Impact You'll Make
- You will define what AI means at Zenskar — the features you ship will be the most visible and differentiated parts of the product
- CS Copilot, if done well, changes how enterprise customer success teams operate every single day — this is a high-stakes, high-visibility surface
- You will establish the engineering culture around AI reliability at Zenskar — evals, observability, and disciplined iteration
- Your work will directly accelerate enterprise deals — AI features are increasingly a buying criterion for our clients
- You will be the person who brings engineering rigour to a domain where most companies ship demos and call it a feature
Ideal Candidate
- Strong Senior GenAI / AI Backend Engineer Profiles
- Mandatory (Experience 1) – Must have 4+ years of total software development experience, with at least 2+ years working on AI/LLM-based features in production
- Mandatory (Experience 2) – Must have strong backend engineering experience using Python (FastAPI / Django preferred) and building production-grade systems
- Mandatory (Experience 3) – Must have hands-on experience building LLM-based applications, including OpenAI / Gemini / similar models in real projects
- Mandatory (Experience 4) – Must have experience with RAG (Retrieval Augmented Generation) including chunking, embeddings, and retrieval pipelines
- Mandatory (Experience 5) – Must have experience designing end-to-end AI pipelines, including chaining, tool usage, structured outputs, and handling failure cases
- Mandatory (Experience 6) – Must have experience building agentic AI systems (multi-step workflows, tool orchestration like LangGraph / CrewAI or custom agents)
- Mandatory (Experience 7) – Must have strong coding and system design skills, not just prompt engineering or experimentation
- Mandatory (Experience 8) – Must have experience shipping AI features in production, not just POCs or research projects
- Mandatory (Experience 9) – Must have experience working with APIs, backend services, and integrations
- Mandatory (Experience 10) – Must have understanding of AI system reliability, including latency, cost optimization, fallback handling, and basic eval thinking
- Mandatory (Company) – Product companies / startups, preferably Series A to Series D
- Mandatory (Note) - Candidate's overall experience should not be more than 7 Yrs
- Mandatory (Tech Stack) – Strong in Python + AI/LLM ecosystem, experience with modern AI tooling and frameworks
- Mandatory (Exclusion) – Reject profiles that are only Prompt Engineers, Data Scientists, or Frontend Engineers without strong backend + system building experience
- Preferred (Skill) – Experience with fine-tuning (LoRA / QLoRA) or open-source model deployment (vLLM / Ollama)
- Preferred (Frontend) – Basic ability to build or contribute to frontend (React or similar)
- Highly Preferred (Education) – Candidates from Tier-1 institutes (IITs, BITS, NITs, IIITs, top global universities)
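The RAG requirement above (chunking, embeddings, retrieval pipelines) can be sketched end to end without any provider SDK by swapping in a toy bag-of-words "embedding". Everything here — the chunker, the embedding, the sample document, and the query — is an illustrative stand-in for real embedding-model and vector-store calls:

```python
import math
from collections import Counter

def chunk(text: str, size: int = 40) -> list[str]:
    """Naive fixed-size chunking by word count; production pipelines
    usually chunk by tokens, with overlap between chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text: str) -> Counter:
    # Toy bag-of-words vector -- a stand-in for a real embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Rank chunks by similarity to the query and return the top k."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

docs = ("Invoices are due net 30 from the billing date. "
        "Refunds require manager approval and a signed form.")
chunks = chunk(docs, size=8)
top = retrieve("when are invoices due", chunks)
print(top)
```

The retrieved chunks would then be stuffed into the LLM prompt as grounding context; swapping `embed` for a real embedding API and the `sorted` call for a vector-store query gives the production shape.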
Profile: Senior Java Developer
🔷 Experience: 5+ Years
🔷 Location: Remote
🔷 Shift: Day Shift
(Only immediate joiners & candidates who have completed notice period)
✨ What we want:
✅ AWS cloud services (MANDATORY)
✅ Docker containerization (MANDATORY)
✅ Spring/Spring Boot framework
✅ RESTful API development
✅ Microservices architecture
✅ Database experience (SQL/NoSQL)
✅ Git version control & CI/CD
✅ Kubernetes orchestration
Key Responsibilities:
Collect, analyze, and maintain data related to infrastructure and e-Governance projects.
Prepare and update MIS reports, dashboards, presentations, and financial analyses.
Study business processes, organizational structures, and assist in drafting EoIs, RFPs, and RFQs.
Support all phases of the project lifecycle, from initiation to post-implementation.
Define requirements, timelines, costing, payment terms, and service levels.
Monitor project status, SLAs, issue resolution, and vendor performance.
Assist in all departmental e-Governance initiatives and related tasks.
Key Skills:
Languages required: English, Hindi and Marathi
Experience working with the infrastructure domain is preferable.
Strong analytical and reporting skills
Understanding of government systems and processes
Experience in project documentation, costing, and vendor management
Proficiency in Excel, PowerPoint, and project tools
We are looking for an experienced and results-driven Business Development Executive who will be in charge of leading our company's sales efforts. As a successful hire, you will be responsible for acquiring new customers to rapidly increase revenue for the company.
Responsibilities:
- Hunt /acquire new customers.
- Independently manage the whole sales pipeline - from lead generation -> lead qualification -> product presentation -> negotiation -> closure and contract signing.
- Making sales pitches and attending sales meetings with prospective clients.
- Ensuring that the sales pipeline is constantly being built and worked upon.
- Doing sufficient research to make effective pitches.
- Build long-term relationships with new and existing customers.
- Refine sales approach and meticulously execute on it (target industries, target buyers, and target personas).
- Keep yourself educated about competitors and emerging trends among customers.
- Partner with the marketing, demand generation and onboarding teams to manage customer acquisition.
Requirements:
- MBA with 2-3 years' experience in a B2B sales role, preferably in the automotive industry.
- Proven ability to achieve targets.
- Proficiency with the MS Office Suite, particularly MS Excel & PowerPoint.
- Ability to draft persuasive proposals.
- Extensive, hands-on experience selling to large and mid-sized corporates.
- Should have a track record of consultative selling.
- Strong in building rapport with customers.
- Excellent communication skills - you are a clear and compelling storyteller across written and verbal mediums.
- Strong listening and problem-solving skills - you invest time in helping customers identify the best solutions for their problems, and do not force your solution upon the customer.
- Ability to negotiate confidently with mid & senior management folks.
- Organized and task driven.
Technical:
- Expertise in WordPress customization and development, with knowledge of plugins and API integration; should be well versed in WordPress.
- Must be able to create their own themes and customize existing/premium themes.
- Must have experience with plugin customization and be aware of new plugin development.
- Must have knowledge of existing WordPress functions, hooks and plugins.
- Experience in WordPress is mandatory; experience with other open-source frameworks would be considered an added advantage.
- WordPress integration directly from PSD/AI/Sketch files would be considered an added advantage.
- Must have knowledge of basic PHP and MySQL concepts.
Non-Technical:
- Interpersonal skills
- Good communication
- Decision making
- Good Team player
- Good Listener
Roles & Responsibilities:
- Designing and implementing new features and functionality
- Establishing and guiding the website’s architecture
- Ensuring high-performance and availability, and managing all technical aspects of the CMS
- Helping to formulate an effective, responsive design and turning it into a working theme and plugin.
Candidate Profile:
Role: Marketing Campaign Coordinator
Vertical: Marketing
Education: Any graduate or postgraduate; MBA in Marketing preferred
Understanding of: Marketing Campaigns, IT Sales process, CRM system, Lead sourcing
Total Experience: Around 2 years
Overview of the Company
Armentum is an IT Services company that provides the following services:
- Website Design and Development
- Web and Mobile Application Development (LAMP and MEAN/MERN Stack)
- SEO Services
Our development center is based in Bangalore, India, and our total team strength is over 50 people. Our team has deep expertise in building large-scale applications in the finance, real estate and banking industries.
Specific Responsibilities
- Excellent communication and content-writing skills - this is non-negotiable and a must.
- Will be responsible for planning and managing all inbound and outbound outreach campaigns, which are a mixture of email marketing and telemarketing.
- Prior knowledge of working with product marketing.
- Coordinate with the telemarketer on follow-ups to convert sourced leads into appointments.
- Maintain the KPIs for the team and the campaigns, draft an analytics report every month, evaluate performance with the growth manager, and course-correct where required.
- Prior knowledge of working on a CRM for end-to-end marketing and sales campaigns.
- Promptly respond to emails from potential clients, coordinating with the sales team to ensure appointments are set up and all sales collateral is ready before a sales call.
Ideal Candidate Description
- 1.5-2 years of experience working in marketing services, specifically in IT product marketing.
- Excellent Writing and Communication Skills
- Proficient with tools like MS Office & Google Drive
- Must possess knowledge of CRM systems and how they operate
- Excellent presentation skills
- Ability to learn things quickly
- Industrious, Orderly and Disciplined
- Must be able to meet tight deadlines
- Excellent documentation skills
- Ability to effectively communicate with different teams
- Quick thinker, with the ability to grasp things quickly and manage time
* Write “clean”, well-designed code
* Produce detailed specifications
* Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
* Contribute in all phases of the development lifecycle
* Follow industry best practices
* Develop and deploy new features to facilitate related procedures and tools if necessary
Requirements:
* Proven software development experience in PHP
* Understanding of frameworks like Laravel, CodeIgniter, WordPress, Yii, etc.
* Demonstrable knowledge of web technologies including HTML, CSS, JavaScript, AJAX, etc.
* Good knowledge of relational databases, version control tools and of developing web services
* Experience with common third-party APIs (Google, Facebook, eBay, etc.)
* Passion for best design and coding practices and a desire to develop new bold ideas
* BS/MS degree in Computer Science, Engineering or a related subject

About the Role:
We are looking for an experienced Full Stack Developer, focused on writing well-structured, efficient and maintainable software. You will be working on writing code, tests and documentation for the Network Management and Head End Systems. You will be interfacing with devices in the field, collecting data from electricity meters and networking equipment. Your responsibility will include the data collection services, external APIs, user interface and data analytics. You will be committed to making a high-quality, fault-tolerant system to interact with physical devices and make collected data available to 3rd-party services.
Qualification & Experience:
- Bachelor’s/Master’s degree in Computer Engineering with 2+ years of experience writing high-quality, efficient and maintainable code.
- Proficiency with Elixir or Erlang. Alternatively, experience with at least two of Ruby on Rails, NodeJS, Python, Lisp/Clojure, Scala, Haskell or similar.
- Proficiency with TypeScript or JavaScript using ReactJS, VueJS or similar.
- Good knowledge of SQL databases (PostgreSQL, MSSQL, Oracle or similar)
- Good knowledge of revision control systems like Git, SVN, TFS or similar
- Experience with NoSQL databases like InfluxDB, Prometheus, Elastic Stack, SOLR or similar is a plus.
- In depth knowledge of Linux would be a plus
- Exposure to schematic and layout design and understanding would be a plus.
- Experience working with the energy metering segment is a plus.
Roles & Responsibilities:
- The selected candidate will be handling software development activities in Elixir and TypeScript
- Technical development activities as per client requirements and internal operational processes.
- Coordinate with multi geographical teams for technical development.
- Close coordination with the Technical Support team to visit customers and understand requirements.
- Maintaining/writing high quality code, tests and documentation
Location: Mohali
Compensation: Commensurate with the experience










