- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components including Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing the DataFrame and Spark SQL APIs for faster data processing; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS (a minimal sketch of this workflow follows this list).
- Used the Spark DataFrames API over the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib for predictive intelligence and customer segmentation, including its use within Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for event ingestion and aggregation, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands-on experience using Sqoop to import data from RDBMS into HDFS and vice versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands-on expertise in real-time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka, and Flume (see the streaming sketch following this list).
- Knowledge of installing, configuring, supporting, and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and with software methodologies such as Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools and Sqoop and Flume as data import/export tools.
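To make the DataFrame-over-Hive workflow mentioned above concrete, here is a minimal PySpark sketch. It assumes a Hive-enabled Spark installation; the table and column names (consumer_response, event_type, event_date) are hypothetical placeholders, not from any actual project.

```python
# Minimal PySpark sketch: analytics on a Hive table with the DataFrame API.
# Table and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-dataframe-sketch")
    .enableHiveSupport()  # lets Spark read/write tables in the Hive metastore
    .getOrCreate()
)

# Read a Hive table into a DataFrame (e.g., data previously landed via Sqoop).
events = spark.table("consumer_response")

# Aggregate with the DataFrame API instead of a raw HiveQL query.
daily_counts = (
    events
    .filter(F.col("event_type").isNotNull())
    .groupBy("event_date", "event_type")
    .count()
    .withColumnRenamed("count", "events")
)

# Persist the result as a Hive table that a Tableau dashboard could read as a feed.
daily_counts.write.mode("overwrite").saveAsTable("consumer_response_daily")
```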
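The Kafka and Spark Streaming experience listed above can be sketched similarly with Structured Streaming. This is an illustrative sketch only: the broker address and topic name are placeholders, and the spark-sql-kafka connector package must be on the classpath.

```python
# Minimal Spark Structured Streaming sketch: consume a Kafka topic.
# Broker address and topic name are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a Kafka topic (requires the spark-sql-kafka connector).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka values arrive as binary; cast to string before processing.
messages = raw.select(F.col("value").cast("string").alias("message"))

# Stream to the console for illustration; a real job would sink to HDFS or Hive.
query = messages.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```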

Key Responsibilities:
Design, develop, and maintain efficient, reusable, and reliable Go code.
Implement and integrate with back-end services, databases, and APIs.
Write clean, scalable, and testable code following best practices and design patterns.
Collaborate with cross-functional teams to define, design, and ship new features.
Optimize application performance for maximum speed and scalability.
Identify and address bottlenecks and bugs, and devise solutions to these problems.
Stay up-to-date with the latest industry trends, technologies, and best practices.
Required Qualifications:
Proven experience as a Golang Developer or similar role in software development.
Proficiency in Go programming language, paradigms, constructs, and idioms.
Experience with server-side development, microservices architecture, and RESTful APIs.
Familiarity with common Go frameworks and tools such as Gin.
Knowledge of implementing monitoring, logging, and alerting systems.
Experience with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB).
Understanding of code versioning tools, such as Git.
Strong understanding of concurrency and parallelism in Go.
Experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes) is a plus.
Excellent problem-solving skills and attention to detail.
Ability to work effectively both independently and as part of a team.
We are looking for passionate and highly driven Academic Counselors to join our dynamic team. The ideal candidate will play a pivotal role in guiding students in their upskilling journey by recommending suitable courses in Data Science and other emerging Tech domains. If you have a flair for sales, excellent communication skills, and a passion for education, we'd love to meet you.
Responsibilities:
- Counsel prospective learners via phone, email, or in-person to help them understand the value and benefits of our tech programs.
- Drive sales conversion by identifying student needs and recommending suitable programs.
- Manage the complete sales cycle: lead qualification, counseling, follow-ups, and closure.
- Maintain accurate records of leads, interactions, and outcomes using CRM tools.
- Meet or exceed monthly targets and KPIs.
- Provide detailed course information, industry insights, and address queries effectively.
Requirements:
- Bachelor's degree in any field.
- MINIMUM 1 year of ED TECH SALES experience required.
- Excellent verbal and written communication skills in English.
- Strong interpersonal skills with the ability to influence and negotiate.
- Self-motivated and target-driven with a problem-solving attitude.
- Tech-savvy and comfortable working in a fast-paced environment.
Preferred Qualifications:
- Prior experience in selling Data Science, Tech, or Professional Certification Courses is a plus.
- Knowledge of CRM software (e.g., Salesforce, Zoho) preferred.
We are looking for somebody to take care of our social media handles and run campaigns to generate leads.
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT enables its customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a strong research team from renowned universities, and a renowned AI award (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines (a minimal Python sketch follows the Nice To Have list below).
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
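As a rough illustration of the ETL pipeline skills listed above, here is a minimal extract-transform-load sketch in Python. The file path, column names, connection string, and table name are hypothetical placeholders, not part of TVARIT's actual stack.

```python
# Minimal ETL sketch: extract from CSV, transform with pandas, load into SQL.
# File path, column names, connection string, and table name are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

def run_pipeline() -> None:
    # Extract: read raw sensor readings exported from the shop floor.
    raw = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])

    # Transform: drop incomplete rows, then aggregate to hourly averages.
    clean = raw.dropna(subset=["machine_id", "temperature"])
    hourly = (
        clean.set_index("timestamp")
        .groupby("machine_id")["temperature"]
        .resample("1h")
        .mean()
        .reset_index()
    )

    # Load: append the aggregates to a warehouse table.
    engine = create_engine("postgresql://user:password@localhost/warehouse")
    hourly.to_sql("hourly_temperature", engine, if_exists="append", index=False)

if __name__ == "__main__":
    run_pipeline()
```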
Job description
- Defining business requirements and reporting them back to stakeholders
- Understanding tasks and creating BRDs (Business Requirements Documents)
- Creating Functional Specification Documents
- Understanding requirements as communicated by business stakeholders
- Understanding stakeholder briefs clearly and proposing suitable solutions in consultation with the team
- Build the required project documentation, workflows, and FSDs (Functional Specification Documents), SRS (Software Requirements Specification)/SOWs (Scope of Work) in collaboration with the stakeholders for the project
- Maintain the SRS documentation during the course of the project
- Understand and manage change requests
- Actively collaborate with project managers and the team to develop plans, and provide weekly status and tracking reports on the health of the project.
- Provide clear, consistent, and timely messaging, both verbal and written, within the team and to technology and business leadership; facilitate effective and timely meetings
Qualifications and skills:
- Minimum 1 year of experience with projects on NodeJs or similar backend technologies
- Understanding of Microservices Architecture
Experience: 1 – 3.5 years
Mandatory Skills
- Should have good knowledge of Object-Oriented Programming concepts.
- Good knowledge of ASP.NET, MVC, and C#.
- Understanding of JavaScript, HTML5, and CSS.
- Good knowledge of SQL Server and RDBMS concepts.
Desired Skill Set:
- Create systems with .NET Framework 4.x / ASP.NET MVC / SQL Server 2008-2016 / WCF / Web Services / Web API based on the design specification.
- Ability to create and optimize SQL Server 2008-2016 stored procedures.
- Experience with jQuery or similar technologies.
- Knowledge of .NET Core is an added advantage.
Description:
- Ability to perform analysis and design, and to develop small- and large-scale systems.
- Responsible for developing new programs and proofing them to identify needed changes, assuring production of a quality product.
- Test written programs to ensure that logic and syntax are correct and that program results are accurate.
- Document code consistently throughout the development process by listing a description of the program, special instructions, and any changes made to database tables at the procedural, modular, and database levels.
- Strong grasp of OO principles
- 1+ years of development experience with OO languages such as C# and with the MVC architecture.
- Sound knowledge of SQL and database technologies
- Knowledge of Agile Methodologies such as Extreme Programming (XP) & Scrum
At Prolifics, we are currently implementing multiple solutions and are looking to hire talented UiPath RPA Engineer candidates for our development centre in India. This is a permanent position based out of Hyderabad/Pune.
If you are looking for a high growth company with rock-solid stability, if you thrive in the energetic atmosphere of high-profile projects, we want to talk to you today! Let’s connect and explore possibilities of having you onboard the Prolifics team!
Job Title: Sr. UIPath RPA Developer
Experience: 6+ years
Location: Hyderabad/ Pune
Fulfilment Type: Fulltime
Job Description:
1. Strong UiPath Development skills
2. Strong communication skills
3. Design, develop, build, deploy, and manage automation bots in a UiPath environment
4. UiPath Citizen Developer skills for low-complexity automation are good to have
5. Knowledge/understanding of business process requirements and experience configuring automation processes in UiPath
6. Strong techno-functional skills and experience working with the business to gather requirements, and with testing teams (e.g., UAT) to perform testing
7. Experience working with infrastructure teams to guide and assist in issue resolution
8. Experience working in cross-functional teams involving cloud environments, including working with security and firewall teams on code issue resolution
9. Knowledge/experience building bots automating SAP ERP, Finance, Supply Chain, SAP Ariba, Vendor Onboarding, GFE+, Logistics, and PKMS – Warehouse Management is nice to have.
About us:
Prolifics Corporation Limited is a global technology solutions provider with a presence across North America (USA and Canada), Europe (UK and Germany), the Middle East, and Asia. In India, we have offshore development centers: 3 in Hyderabad and 1 in Pune. For more than 40 years, Prolifics has transformed enterprises of all sizes, including over 100 Fortune 1000 companies, by solving their complex IT challenges. Our clients include Fortune 50 and Fortune 100 companies across a broad range of industries, including Financial Services, Insurance, Government, Healthcare, Telecommunications, Manufacturing, and Retail. We consistently rank in the Dream Companies to Work For and Dream Employer of the Year rankings from the World HRD Congress, ranking 7th in 2019. We encourage you to visit us at www.prolifics.com or follow us on Twitter, LinkedIn, Facebook, YouTube, and other social media to learn more about us.
Vyapar is a SaaS-based startup company located at HSR Layout, Sector 1, Bangalore. We recently completed Series B funding of 200+ Cr, with lead investors IndiaMART & Westbridge Capital.
Who uses Vyapar?
Vyapar is billing, accounting, inventory management & online store software used by almost every type of business in India, be it manufacturers, distributors, or retailers.
What is the role about?
Job description
● Call customers within 2 hours of lead assignment and schedule an appointment for a demo of the SaaS product
● Provide a detailed demo to the customer and clarify doubts based on their business use case
● Engage, follow up, and sell SaaS subscriptions
● Follow the sales SOP.
Desired Candidate Profile
● Self-motivated with good communication skills, negotiation skills and zeal to perform.
● Fluent in Hindi and English language
Perks & Benefits
● Mediclaim, gratuity, sales incentives, attendance incentives.
- Must have one or two years of experience in client handling.
- Comfortable working for B2B clients.
- Must have good sales and negotiation skills.
- Must have relationship-building skills.
- Need to Onboard New channel Partners/ Distributors.
- Need to follow the SOP while onboarding Partners.
- Need to guide the partner about the partner program along with the Product.
- Responsible for partners' Business and all the processes related to the partner Program.
- Need to motivate partners to drive acquisitions.
- Need to maintain long-term relationships with assigned and onboarded partners.
Our client is a Community Commerce company focused on fashion and accessories. Their community network and technology are leading a rejig of fashion retail and supply chain in India. Their network makes users buyers and sellers at the same time, offering unbeatable prices on products and rewards for sharing deals across social media.
The founders are alumni of prestigious tech and business institutes, with expertise and experience in e-commerce and distribution facilities. They have ensured quality and fashion at factory prices, which work best when shared rapidly with communities on social networks.
- Creating & managing content for app, website & overall internet presence
- Creating innovative content experiences that would attract and delight target users and set us apart from the competition.
- Creating content ideas that users can read, view, watch, interact with and share.
- Delivering engaging content on a regular basis
- Managing the execution of content needs through designer and video agency
- Analyzing content performance metrics and driving optimization.
- Researching product-related features to use in creating content for users
What you need to have:
- Any Graduation
- Creative thinking along with excellent communication skills
- Ability to work methodically and meet deadlines
- Ability to work independently and as part of a team
We are looking for a Senior Tally Developer to integrate our platform with Tally.
Job responsibilities will be cross-functional, ranging from "clean" coding to architectural changes to client interactions. Over time, you're expected to challenge the existing tech stack and add your own components to it. You should be a team player with an eye for detail and strong problem-solving skills.
Requirements:
- 4+ Years of experience as a Tally Developer
- In-depth knowledge of building TDLs (Tally Definition Language) for Tally
- Familiarity with Agile development methodologies.
- Perform high-level technical design with guidance, including functional modeling and module breakdown, with a focus on platform thinking and reuse.
- Sense of ownership and attention to detail.