- Big data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components, spanning data ingestion, data modeling, querying, processing, storage, analysis, data integration, and the implementation of enterprise-level big data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of big data components and the underlying infrastructure of Hadoop clusters.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing big data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with the smooth maintenance of Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for various events of ingestion and aggregation, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
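Several of the bullets above describe Sqoop imports into HDFS and Hive transformations scheduled through Oozie. A minimal sketch of such an Oozie workflow is shown below; the connection string, table name, HDFS paths and script name are illustrative placeholders, not values from any actual project.

```xml
<!-- workflow.xml: Sqoop import followed by a Hive transformation (placeholder values) -->
<workflow-app name="daily-ingest" xmlns="uri:oozie:workflow:0.5">
  <start to="sqoop-import"/>

  <!-- Pull the source table from an RDBMS into HDFS -->
  <action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.3">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:mysql://db-host/sales --table orders --target-dir /data/raw/orders -m 4</command>
    </sqoop>
    <ok to="hive-transform"/>
    <error to="fail"/>
  </action>

  <!-- Load the raw data into a Hive external table that feeds the dashboards -->
  <action name="hive-transform">
    <hive xmlns="uri:oozie:hive-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>transform.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>

  <kill name="fail">
    <message>Ingest failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

A coordinator definition would typically wrap this workflow to run it on a daily schedule.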

About Molecular Connections

JOB DETAILS:
Job Role: Lead I - .Net Developer - .NET, Azure, Software Engineering
Industry: Global digital transformation solutions provider
Work Mode: Hybrid
Salary: Best in Industry
Experience: 6-8 years
Location: Hyderabad
Job Description:
• Experience in Microsoft Web development technologies such as Web API, SOAP XML
• C#/.NET/.NET Core and ASP.NET Web application experience; cloud-based development experience in AWS or Azure
• Knowledge of cloud architecture and technologies
• Support/Incident management experience in a 24/7 environment
• SQL Server and SSIS experience
• DevOps experience of Github and Jenkins CI/CD pipelines or similar
• Windows Server 2016/2019+ and SQL Server 2019+ experience
• Experience of the full software development lifecycle
• You will write clean, scalable code, with a view towards design patterns and security best practices
• Understanding of Agile methodologies, working within the Scrum framework; AWS knowledge
Must-Haves
C#/.NET/.NET Core (experienced), ASP.NET Web application (experienced), SQL Server/SSIS (experienced), DevOps (Github/Jenkins CI/CD), Cloud architecture (AWS or Azure)
.NET (Senior level), Azure (Very good knowledge), Stakeholder Management (Good)
Mandatory skills: .NET Core with Azure or AWS experience
Notice period - 0 to 15 days only
Location: Hyderabad
Virtual Drive - 17th Jan
- Strong AI/ML OR Software Developer Profile
- Mandatory (Experience 1) - Must have 3+ years of experience in core software development (SDLC)
- Mandatory (Experience 2) - Must have 2+ years of experience in AI/ML, preferably in the conversational AI domain (speech to text, text to speech, speech emotion recognition) or agentic AI systems.
- Mandatory (Experience 3) - Must have hands-on experience in fine-tuning LLMs/SLMs, model optimization (quantization, distillation) and RAG
- Mandatory (Experience 4) - Hands-on programming experience in Python, TensorFlow, PyTorch and model APIs (Hugging Face, LangChain, OpenAI, etc.)
Focus: General PTE and IELTS
Schedule: Flexible. Must be available from 6:30 AM IST when required.
Compensation: 10 AUD per hour
Job Summary
We are looking for an experienced and dedicated PTE / IELTS Tutor to deliver effective online training sessions for students preparing for General PTE and IELTS exams. The ideal candidate understands exam structures, scoring patterns, and coaching strategies that help learners consistently hit their target scores.
Key Responsibilities
- Conduct engaging online classes for General PTE and IELTS learners.
- Create personalized lesson plans based on student proficiency.
- Teach all four modules: Speaking, Listening, Reading, and Writing.
- Run mock tests and provide detailed performance feedback.
- Track student progress and share clear improvement strategies.
- Explain exam formats, marking criteria, and time management methods.
- Maintain smooth and professional communication with students and management.
Requirements
- Proven experience teaching PTE and/or IELTS.
- Strong understanding of General PTE test format.
- Excellent English communication skills.
- Ability to teach online via Zoom, Google Meet, or similar tools.
- Availability at 6:30 AM IST when needed.
- Strong interpersonal and motivational skills.
Preferred Qualifications
- PTE score of 79+ or IELTS 8.0+.
- Previous online tutoring experience.
- TESOL, TEFL, or related certification is a plus.
Benefits
- Competitive pay at 10 AUD per hour.
- Flexible working hours.
- 100 percent remote. Work from home.
- Opportunity to train students from diverse international backgrounds.
Key Responsibilities
- Flow Development & Automation
- Develop, maintain, and enhance CAD automation scripts and flows for physical design (place-and-route, timing closure, physical verification, etc.).
- Integrate and validate EDA tools for synthesis, floorplanning, clock tree synthesis, routing, and sign-off.
- EDA Tool Support
- Work closely with design teams to debug and resolve CAD/EDA tool issues.
- Collaborate with EDA vendors for tool evaluations, feature requests, and bug fixes.
- Physical Verification & Sign-Off
- Build and maintain flows for DRC, LVS, ERC, IR drop, EM, and timing sign-off.
- Ensure physical design flows meet foundry requirements and tapeout schedules.
- Methodology Development
- Develop best practices and guidelines for efficient design closure.
- Evaluate new EDA technologies and propose improvements to existing workflows.
Our client is currently looking for a Data Solutions Associate Director to join the data, technology & analytics division.
- Support agencies on client brief responses, working with the analytics & technology directors to develop tailored solutions using proprietary or partner technologies.
- Conduct client data & technology audits for development of audience strategies.
- Deliver detailed digital transformation roadmaps to support audience activation, provided in a client-ready format for agency partners to utilize.
- Project manage all signed-off SOWs from initiation to closure and ensure effective delivery.
- Work closely with agency planning teams to identify opportunities for clients within data & tech service scope.
- As the SME on data marketplaces, be the POC for buying teams with enquiries on data strategy for media activation (source, validate, build).
- Develop agency data ‘champions’ in your respective market through training and enablement programs covering identity resolution, data governance and ethics policies, as defined by our global and legal teams.
- Work with the Regional & Marketing teams to create & deliver Learning & Development sessions on our partner solutions to buying and agency teams.
- Lead and develop key relationships with relevant data & technology partners, working with investment & agencies to deliver opportunities that support our client and internal requirements.
- Any other ad-hoc projects or tasks relevant to the position.
- Build a great working relationship with your team, market teams and client team, operating seamlessly together to deliver success for our clients, across the briefing, booking, set up, optimization and reporting processes.
Requirements
- Bachelor’s degree in a relevant quantitative field (Engineering, Computer Science, Analytics)
- 10+ years in a data & technology consulting role, with expertise in developing client data strategies (eg CRM, 1st party, 3rd party, CDPs/DMPs) for media activation.
- Thorough understanding of digital marketing channels (specifically search, social & programmatic) & digital media metrics
- Expert knowledge on the data landscape and relevant topics within the industry.
- Be a strategic problem solver, with the ability to understand client business challenges and develop creative, effective & measurable solutions
- High-level stakeholder management capabilities, who can influence a diverse range of teams and individuals
- Experienced trainer, having worked with marketing teams to deliver learning & development programs.
- Outstanding communication skills, both written and verbal. Must be comfortable presenting to clients if needed.
- Strong data & tech partner management capabilities, with established relationships in place with key partners (eg Data onboarders, aggregators, panel providers, etc)
- Highly organised, detail orientated, QA-focused with demonstrated project management capabilities
- Flexibility to work in a cross-functional team but also have the initiative to problem solve independently
- The following will be highly regarded:
- Certifications (in addition to demonstrated experience) with Marketing Technology platforms (eg SFMC, AEC)
- Advanced knowledge of clean room solutions
- Hands-on experience in either a technology or analytics role, within an agency or management consulting firm.
- Commercial experience within Data & Tech consulting (eg packaging service offerings to generate revenue, developing ratecards, SOWs etc)
FullStack Developer
We are currently building the Technology platform for the Global Air Cargo industry ($300B market).
Currently, we have two open roles (1 Front End Engineer & 1 Back End Engineer).
Full Stack Engineers are also desired for the open positions.
These (2) engineers will be part of the Tech team which will be at the core of what we aspire to build.
Looking for Full Stack / Front End / Back End Developer
- Tech Stack (React, Node.js, TypeScript, PostgreSQL, AWS)
- loves to code, get hands-on and likes to build products from scratch
- 2+ years of experience building software products
- "Build local, Launch global" mindset
- Good with people and teams (a plus)
- Prior startup experience (a plus)
- SaaS platform expertise (a plus)
- Work location - Chennai
- So we are looking only for candidates in Chennai or who are open to relocating to Chennai for this opportunity
- Open to explore both Contract & Full time options for the (2) roles
Responsibilities:
· Handling/maintaining client relations and communication
· Creative and innovative in idea generation for all kinds of branding activities for the client
· Excellent knowledge of branding and promotional activities for all leading social media platforms such as Facebook, Twitter, Instagram, LinkedIn etc.
· Assist with the planning of marketing strategies to help drive traffic and engagement to the website
· Produce content for social media channels such as Facebook, Twitter, Instagram, LinkedIn, Pinterest etc.
· Keep up to date with any social media trends
· Track social media influencers
· Ensure you produce a consistent brand message, captions, and content across all the social media channels
· Regularly monitor competitor social media sites and create competitor analysis reports
· Contribute to the company blog
· Assist with social media performance reports
· Engage with social media users and respond to any mentions over Facebook, Twitter, Instagram, LinkedIn, Pinterest etc.
· Writing effective SEO content for blogs, websites and social media accounts.
· Coordination with the digital marketing team to report findings and recommendations.
Qualifications:
· Graduates with a minimum of 3-4 years of experience.
· Outgoing and easy to work with.
· Substantial social media presence.
· Ambitious and hardworking.
· Knowledge of tools for designing and video editing is an added advantage.
· Excellent communication skills
· Relevant experience in determining a target audience and how to tailor unique marketing campaigns to capture their attention.
- Expert with DeFi, NFT and latest Crypto Concepts. Should have working experience on these concepts
- Strong knowledge with Smart Contract best practices
- Experience with React JS & Node JS Required (We prefer MERN Stack). Having TypeScript experience would be an advantage
- An understanding of AWS and of how the setup and deployment of project elements work would be an advantage.
- Knowledge of GitHub is a must
- Should handle Solidity development
- Should be able to write smart contracts
- Education: Not specific
- Skills: Experience working with Polkadot & Binance smart chains will be an advantage.
The brand is associated with some of the major icons across categories and has tie-ups with industries covering fashion, sports, and, of course, music. The founders are marketing grads with vast experience in consumer lifestyle products and other major brands. With their vigorous efforts toward quality and marketing, they have been able to strike a chord with major e-commerce brands and even consumers.
What you will do:
- Collaborating with the UX and UI design teams to produce seamless, robust and innovative front-end user experiences
- Working closely with Project Managers to deploy project requirements
- Managing multiple projects simultaneously and addressing their specific needs and requirements at a moment's notice
- Understanding business needs that drive project features and functions and provide internal consultation
Desired Candidate Profile
What you need to have:
- Bachelor's Degree in IT, Computer Science, Computer Programming or a similar field
- 2+ years of experience developing within the Shopify and Shopify Plus platforms
- 2+ years of experience in front-end technologies including, but not limited to, JavaScript, AJAX, HTML, CSS, SASS, XML
- Troubleshooting and debugging skills
- Interest in staying current and applying the most current best practices
- Ability to work effectively in a fluid, fast-paced environment
- A true passion for design and technology
- A disciplined approach to testing and quality assurance
- Positive attitude, high energy, and love for broadening web development skill set
