

- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions; experienced in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience coding MapReduce/YARN programs in Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked with Spark (Scala) on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL/Oracle.
- Implemented Spark applications in Python using DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the results into HDFS (a minimal sketch follows this list).
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance of Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation of different events, loading consumer response data into Hive external tables in HDFS to serve as the feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with Microsoft cloud and with setting up clusters on Amazon EC2 and S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked with Spark (Python) on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing most of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with a Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools and Sqoop and Flume as data import/export tools.
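The bullets above describe a recurring pattern: land RDBMS data in HDFS with Sqoop, transform it with the Spark DataFrame/Spark SQL API, and publish it to Hive tables that feed Tableau dashboards. Below is a minimal PySpark sketch of that pattern; the paths, table names, and columns are hypothetical placeholders rather than details from the profile above.

```python
"""Minimal PySpark sketch: read data that Sqoop has landed in HDFS,
aggregate it with the DataFrame API, and publish it to a Hive table
that a dashboard can read. All names/paths are illustrative."""
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("consumer-response-pipeline")
         .enableHiveSupport()          # allows writing managed Hive tables
         .getOrCreate())

# Sqoop is assumed to have imported the source RDBMS table to this HDFS path.
raw = spark.read.parquet("hdfs:///data/raw/consumer_response/")

# Aggregate responses per campaign and day using the DataFrame API.
daily = (raw
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("campaign_id", "event_date")
         .agg(F.count("*").alias("responses")))

# Expose the result as a Hive table that a Tableau dashboard can query.
daily.write.mode("overwrite").saveAsTable("analytics.daily_consumer_response")
```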

PortOne is re-imagining payments in Korea and other international markets. We are a Series B funded startup backed by prominent VC firms SoftBank and Hanwha Capital.
PortOne provides a unified API for merchants to integrate with and manage all of the payment options available in Korea and SEA markets - Thailand, Singapore, Indonesia, etc. It is currently used by 2,000+ companies and processes multiple billions of dollars in annualized volume. We are building a team to take this product to international markets, and we are looking for engineers with a passion for fintech and digital payments.
Culture and Values at PortOne
- You will be joining a team that stands for Making a difference.
- You will be joining a culture that identifies more with Sports Teams rather than a 9 to 5 workplace.
- This will be a remote role that gives you the flexibility to save time on your commute
- You will have peers who are/have
- Highly self-driven, with a sense of purpose
- High energy levels - Building stuff is your sport
- Ownership - Solve customer problems end to end - The customer is your boss
- Hunger to learn - Highly motivated to keep developing new tech skill sets
Who are you?
* You are an athlete and Devops/DevSecOps is your sport.
* Your passion drives you to learn and build stuff and not because your manager tells you to.
* Your work ethic is that of an athlete preparing for your next marathon. Your sport drives you and you like being in the zone.
* You are NOT a clock-watcher renting out your time, and you do NOT have an attitude of "I will do only what is asked for"
* You enjoy solving problems and delighting users, both internally and externally
* You take pride in working projects through to successful completion, involving a wide variety of technologies and systems
* You possess strong and effective communication skills and the ability to present complex ideas in a clear and concise way
* You are responsible, self-directed, a forward thinker, and operate with focus, discipline, and minimal supervision
* A team player with a strong work ethic
Experience
* 2+ years of experience working as a DevOps/DevSecOps Engineer
* BE in Computer Science or an equivalent combination of technical education and work experience
* Must have actively managed infrastructure components and DevOps for high-quality, high-scale products
* Proficient knowledge of and experience with infra concepts - Networking / Load Balancing / High Availability
* Experience designing and configuring infrastructure on cloud service providers - AWS / GCP / AZURE
* Knowledge on Secure Infrastructure practices and designs
* Experience with DevOps, DevSecOps, Release Engineering, and Automation
* Experience with Agile development incorporating TDD / CI / CD practices
Hands on Skills
* Proficient in at least one high-level programming language: Go / Java / C
* Proficient in scripting - e.g., Bash - to build/glue together DevOps/data pipeline workflows
* Proficient in Cloud Services - AWS / GCP / AZURE
* Hands on experience on CI/CD & relevant tools - Jenkins / Travis / Gitops / SonarQube / JUnit / Mock frameworks
* Hands on experience on the Kubernetes ecosystem & container-based deployments - Kubernetes / Docker / Helm Charts / Vault / Packer / Istio / Flyway
* Hands on experience on Infra as code frameworks - Terraform / Crossplane / Ansible
* Version Control & Code Quality: Git / Github / Bitbucket / SonarQube
* Experience on Monitoring Tools: Elasticsearch / Logstash / Kibana / Prometheus / Grafana / Datadog / Nagios
* Experience with RDBMS Databases & Caching services: Postgres / MySql / Redis / CDN
* Experience with Data Pipeline / Workflow tools: Airflow / Kafka / Flink / Pub-Sub
* DevSecOps - Cloud Security Assessment, Best Practices & Automation
* DevSecOps - Vulnerability Assessments / Penetration Testing for Web, Network and Mobile applications
* Preferable to have DevOps/Infra experience for products in the Payments/Fintech domain - Payment Gateways / Bank integrations etc
What will you do?
Devops
* Provisioning the infrastructure using Crossplane/Terraform/CloudFormation scripts.
* Creating and managing AWS services such as EC2, S3, VPC, KMS, and IAM, along with EKS clusters and RDS databases.
* Monitor the infra to prevent outages/downtimes and honor our infra SLAs
* Deploy and manage new infra components.
* Update and Migrate the clusters and services.
* Reducing cloud cost by stopping or scheduling less-utilized instances (a minimal sketch follows this list).
* Collaborate with stakeholders across the organization such as experts in - product, design, engineering
* Uphold best practices in Devops/DevSecOps and Infra management with attention to security best practices
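As one illustration of the cost-reduction item above, here is a minimal boto3 sketch that stops running instances tagged for office-hours-only use. The tag name, region, and scheduling convention are assumptions made for the example, not PortOne's actual setup.

```python
"""Minimal boto3 sketch: stop running EC2 instances tagged for
office-hours-only use, e.g. run nightly from a scheduler."""
import boto3

ec2 = boto3.client("ec2", region_name="ap-northeast-2")  # assumed region

def stop_off_hours_instances() -> None:
    # Find running instances explicitly tagged for off-hours shutdown.
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:schedule", "Values": ["office-hours"]},  # assumed tag
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)

if __name__ == "__main__":
    stop_off_hours_instances()
```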
DevSecOps
* Cloud Security Assessment & Automation
* Modify existing infra to adhere to security best practices
* Perform Threat Modelling of Web/Mobile applications
* Integrate security testing tools (SAST, DAST) into CI/CD pipelines (a minimal sketch follows this list)
* Incident management and remediation - monitoring security incidents, recovering from and remediating issues
* Perform frequent Vulnerability Assessments / Penetration Testing for Web, Network and Mobile applications
* Ensure the environment is compliant with CIS, NIST, PCI DSS, etc.
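As a rough illustration of wiring SAST into a CI/CD pipeline, here is a minimal Python sketch that runs Bandit (a Python SAST scanner, standing in here for whichever tool the team actually uses) and fails the build when findings are reported. The source directory and report path are assumptions.

```python
"""Minimal CI gate sketch: run a SAST scan (Bandit) and propagate its
exit code so the pipeline fails when issues are found."""
import subprocess
import sys

def run_sast() -> int:
    # Bandit exits non-zero when it reports issues; surface that to CI.
    result = subprocess.run(
        ["bandit", "-r", "src", "-f", "json", "-o", "bandit-report.json"],
        check=False,
    )
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_sast())
```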
Here are examples of apps/features you will be supporting as a Devops/DevSecOps Engineer
* Intuitive, easy-to-use APIs for payment processing.
* Integrations with local payment gateways in international markets.
* Dashboard to manage gateways and transactions.
* Analytics platform to provide insights

We are hiring a QA Automation Engineer with strong expertise in automation frameworks and hands-on manual testing. The role requires designing and executing test strategies, building automation scripts, identifying bugs, and ensuring the delivery of high-quality, secure, and scalable applications.
Key Responsibilities
- Design, develop, and maintain automation test scripts using Selenium, Appium, Playwright, or Cypress (a minimal sketch follows this list).
- Perform manual testing (functional, regression, smoke, UAT) for web and mobile apps.
- Execute test cases, track bugs, and document results using JIRA or similar tools.
- Conduct API testing using Postman / Rest Assured.
- Perform cross-browser and cross-platform testing to ensure compatibility.
- Collaborate with developers and product teams to reproduce and resolve defects.
- Support CI/CD pipelines with automated test integration (Jenkins, GitLab CI, GitHub Actions).
- Conduct database testing (SQL queries) for data validation.
- Contribute to performance testing using JMeter or LoadRunner.
- Ensure test documentation (test plans, test cases, defect reports, execution results) is accurate and updated.
- Work in Agile/Scrum teams, participate in sprint planning, and provide QA estimates.
- Apply security testing basics to detect vulnerabilities.
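As a rough illustration of the automation and API-testing responsibilities listed above, here is a minimal pytest sketch combining a Selenium smoke test with a REST API check. The base URL, element ID, and endpoint are hypothetical placeholders.

```python
"""Minimal pytest sketch of UI + API test automation.
All URLs, selectors, and endpoints are illustrative placeholders."""
import pytest
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://staging.example.com"   # assumed test environment

@pytest.fixture
def browser():
    driver = webdriver.Chrome()            # assumes chromedriver on PATH
    yield driver
    driver.quit()

def test_login_page_loads(browser):
    # UI smoke test: the login form renders and the username field is visible.
    browser.get(f"{BASE_URL}/login")
    assert browser.find_element(By.ID, "username").is_displayed()

def test_health_endpoint():
    # API check: the service reports healthy.
    resp = requests.get(f"{BASE_URL}/api/health", timeout=10)
    assert resp.status_code == 200
```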
Requirements
- 3+ years work experience writing clean production code
- Well versed with maintaining infrastructure as code (Terraform, CloudFormation, etc.). High proficiency with Terraform / Terragrunt is absolutely critical
- Experience setting up CI/CD pipelines from scratch
- Experience with AWS (EC2, ECS, RDS, ElastiCache, etc.), AWS Lambda, Kubernetes, Docker, and service mesh
- Experience with ETL pipelines and big data infrastructure
- Understanding of common security issues
Roles / Responsibilities:
- Write Terraform modules for deploying different components of the infrastructure in AWS, such as Kubernetes, RDS, Prometheus, Grafana, and static websites (a minimal sketch follows this list)
- Configure networking, autoscaling, continuous deployment, security, and multiple environments
- Make sure the infrastructure is SOC2, ISO 27001 and HIPAA compliant
- Automate all the steps to provide a seamless experience to developers.
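The Terraform modules themselves would be written in HCL; as an illustration of the "automate all the steps" item, below is a minimal Python sketch of a developer-facing wrapper around the Terraform CLI. The infra/envs/<env> repository layout and environment names are assumptions, not details from this posting.

```python
"""Minimal sketch: wrap the Terraform CLI so developers can deploy an
environment with one command. Repo layout and env names are assumed."""
import subprocess
import sys

def terraform(args: list[str], workdir: str) -> None:
    # Run a Terraform CLI command and fail loudly on errors.
    subprocess.run(["terraform", *args], cwd=workdir, check=True)

def deploy(env: str) -> None:
    workdir = f"infra/envs/{env}"          # assumed repository layout
    terraform(["init", "-input=false"], workdir)
    terraform(["plan", "-input=false", "-out=tfplan", f"-var=environment={env}"], workdir)
    terraform(["apply", "-input=false", "tfplan"], workdir)

if __name__ == "__main__":
    deploy(sys.argv[1] if len(sys.argv) > 1 else "staging")
```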

Position Overview: We are seeking a talented and motivated ReactJS Intern to join our team for a 6-month full-time internship. This is a fantastic opportunity for someone with prior internship experience to deepen their knowledge and skills in ReactJS while working on cutting-edge projects.
Responsibilities:
- Collaborate with the development team to design, develop, and maintain web applications using ReactJS.
- Write clean, maintainable, and efficient code.
- Participate in code reviews and contribute to the improvement of the codebase.
- Work closely with designers to implement user interfaces and user experiences.
- Assist in troubleshooting, debugging, and optimizing applications.
- Stay updated with the latest industry trends and technologies related to ReactJS.
- Contribute to the documentation of technical processes and workflows.
- Engage in team meetings and brainstorming sessions to contribute innovative ideas.
Requirements:
- Previous internship experience in a software development role.
- Knowledge of ReactJS and its core principles.
- Proficiency in JavaScript, HTML5, and CSS3.
- Familiarity with RESTful APIs and asynchronous request handling.
- Understanding of modern front-end build pipelines and tools.
- Experience with version control systems, preferably Git.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively in a collaborative team environment.
- Strong communication skills and the ability to articulate technical concepts clearly.
Nice to Have:
- Understanding of state management libraries such as Redux or the Context API.
- Knowledge of TypeScript and its integration with ReactJS.
- Familiarity with front-end testing frameworks and tools (e.g., Jest, Enzyme).
- Understanding of responsive design principles and mobile-first development.
- Experience with CSS preprocessors like SASS or LESS.
What We Offer:
- Hands-on experience with real-world projects and modern technologies.
- Mentorship and guidance from experienced professionals.
- A collaborative and inclusive work environment.
- Opportunity to contribute to meaningful projects and make a real impact.
- Competitive stipend and potential for a full-time position upon successful completion of the internship.
BDA and BDE
ROLES
We're seeking a qualified sales associate to sell the annual car and bike Scotty subscription products that our customers have grown to rely on. The sales associate will use their skills to generate high-quality leads, build strong relationships with customers, and close deals. The ideal candidate has strong sales skills and demonstrates the ability to showcase our offerings compellingly.
Job Location: Work from home
Selection process: HR Round followed by group discussion and sales manager round.
Qualification: Any Graduate/Post Graduate
Salary offered: As per industry standards
Working days: 6 days a week (Sunday off)
Shifts: 10:00 am - 7:00 pm
Mandatory language: English
Laptop/Wi-Fi: candidates are to use their own laptops
Additional Compensation: If applicable, this will be decided on the basis of your designation.
Key responsibilities:
1. Creating an inspiring team environment with an open communication culture
2. Setting clear team goals
3. Oversee day-to-day operations
4. Monitor team performance and report
5. Motivate team members
6. Generate sales through customer referrals
7. Review the sales team performance and explore sales improvement initiatives to achieve the sales targets for the project
8. Conduct sales review meetings with the sales team on a periodic basis
9. Recruit good-quality prospective sales candidates and ensure that they complete probation in order to achieve the sales targets set by the company
Job Title: Marketing Associate (Outbound)
Company: All Wave AV
Website: www.allwaveav.com
Location: Hyderabad, India
Company Overview:
All Wave AV is a premier AV integrator headquartered in Mumbai, India, specializing in executing projects and delivering services for global enterprises across India. With 300+ installations spanning 22+ years, our team of 90+ professionals brings professionalism and precision to every installation. We are a full-stack system integrator, offering design, supply, installation, and maintenance services. As members of the PSNI Global Alliance, we ensure the highest standards in delivering projects and services.
Job Description:
We are seeking a Marketing Associate (Outbound) to join our team. The ideal candidate will have a pleasant voice and the ability to prospect leads and set up appointments and demos. Previous experience in selling technology, software, or hardware solutions would be a bonus. The compensation package will be based on experience and will include a small variable component based on outcomes.
Responsibilities:
Build a quantifiable sales pipeline using the provided database.
Use a pleasant voice to communicate effectively with potential clients.
Demonstrate excellent English speaking and writing skills.
Ensure accurate and concise explanations of our solutions during phone calls and follow up with appropriate emails to decision-makers.
Assist in building a sales team while adhering to the established process.
Conduct occasional Zoom calls as required.
Work primarily from the office.
Qualifications:
Bachelor's degree.
Minimum 2 years of experience in outbound tele-calling, making 30-40 calls per day.
Strong communication skills, both verbal and written.
Ability to remain composed under pressure.
Highly organized and professional in approach.
- Drive and own sales and revenue targets from ecommerce operations for the brand marketing campaign on marketplaces and offer management
- Monitor, measure and maintain reports on performance to drive attainment of defined objectives by optimisation
- Plan and execute promotional schemes and campaigns to maximise growth
- Provide insights into the latest consumer and shopping trends to identify assortment and selection gaps and plan effective branding strategies; cover performance marketing, inventory management, and catalogue mapping/listing
- Plan the AOP business closely with the customer and the internal teams to help deliver the toplines at a quarterly and annual level
- Keep track of all health and hygiene measures: the e-tailers' stock holding in terms of number of days as well as quality of stock, correct PO execution, and availability of our SKUs; work on a monthly offtake plan for all brands, tracking it periodically and ensuring the right inputs are planned both on-platform and off-platform to drive business
- Manage the e-tailer P&L, keep close track of the account's profitability, and keep a check on promotional expenses
- Opening new platforms for the brand
- 3-4 years of handling ecommerce platforms such as Amazon, Flipkart, and BigBasket
Total Experience: 4-7 Years
Must Have (2-3yrs):
Appium, Selenium or Protractor, any programming language, Cucumber framework
Good to Have (2-3 yrs.):
Rest Assured, Jenkins CI/CD integration, database knowledge (SQL, Oracle, etc.), GitHub, etc.

Qrata is hiring for a company that provides business tools for creators to set up paid communities, offering one-time and subscription-based services.
Quick Facts
Founded in: 2019
Number of Employees: 11 - 50 Employees
Industry: Marketing & Advertising
Headquarters: Mumbai
Technology Stack: MongoDB, Express, React, Node
- Work with development teams and product managers to ideate software solutions
- Design client-side and server-side architecture
- Build the front-end of applications through appealing visual design
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Test software to ensure responsiveness and efficiency
- Troubleshoot, debug and upgrade software
- Create security and data protection settings
- Build features and applications with a mobile responsive design
- Write technical documentation
- Work with data scientists and analysts to improve software




