11+ Beautiful Soup Jobs in Chennai | Beautiful Soup Job openings in Chennai
Apply to 11+ Beautiful Soup Jobs in Chennai on CutShort.io. Explore the latest Beautiful Soup Job opportunities across top companies like Google, Amazon & Adobe.
Role : Web Scraping Engineer
Experience : 2 to 3 Years
Job Location : Chennai
About OJ Commerce:
OJ Commerce (OJC), a rapidly expanding and profitable online retailer, is headquartered in Florida, USA, with a fully-functional office in Chennai, India. We deliver exceptional value to our customers by harnessing cutting-edge technology, fostering innovation, and establishing strategic brand partnerships to enable a seamless, enjoyable shopping experience featuring high-quality products at unbeatable prices. Our advanced, data-driven system streamlines operations with minimal human intervention.
Our extensive product portfolio encompasses over a million SKUs and more than 2,500 brands across eight primary categories. With a robust presence on major platforms such as Amazon, Walmart, Wayfair, Home Depot, and eBay, we directly serve consumers in the United States.
As we continue to forge new partner relationships, our flagship website, www.ojcommerce.com, has rapidly emerged as a top-performing e-commerce channel, catering to millions of customers annually.
Job Summary:
We are seeking a Web Scraping Engineer and Data Extraction Specialist who will play a crucial role in our data acquisition and management processes. The ideal candidate will be proficient in developing and maintaining efficient web crawlers capable of extracting data from large websites and storing it in a database. Strong expertise in Python, web crawling, and data extraction, along with familiarity with popular crawling tools and modules, is essential. Additionally, the candidate should demonstrate the ability to effectively utilize API tools for testing and retrieving data from various sources. Join our team and contribute to our data-driven success!
Responsibilities:
- Develop and maintain web crawlers in Python.
- Crawl large websites and extract data.
- Store data in a database.
- Analyze and report on data.
- Work with other engineers to develop and improve our web crawling infrastructure.
- Stay up to date on the latest crawling tools and techniques.
Required Skills and Qualifications:
- Bachelor's degree in computer science or a related field.
- 2-3 years of experience with Python and web crawling.
- Familiarity with tools/modules such as Scrapy, Selenium, Requests, and Beautiful Soup.
- API tools such as Postman or equivalent.
- Working knowledge of SQL.
- Experience with web crawling and data extraction.
- Strong problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Excellent communication and documentation skills.
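As a sketch of the crawl → extract → store workflow this role describes, the snippet below parses an HTML fragment with Beautiful Soup and writes the results to SQLite. The page structure, CSS selectors, and table schema are illustrative assumptions, not taken from any real target site; a real crawler would fetch `html` over the network (e.g. with Requests or Scrapy) and use a persistent database.

```python
import sqlite3
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Sample HTML standing in for a fetched product page; in a real crawler
# this would come from requests.get(url).text or a Scrapy response.
html = """
<ul id="products">
  <li class="product"><span class="name">Desk</span><span class="price">199.99</span></li>
  <li class="product"><span class="name">Chair</span><span class="price">89.50</span></li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
rows = [
    (item.select_one(".name").get_text(strip=True),
     float(item.select_one(".price").get_text(strip=True)))
    for item in soup.select("li.product")
]

conn = sqlite3.connect(":memory:")  # swap for a real database in production
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])  # → 2
```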
What We Offer
• Competitive salary
• Medical benefits/accident cover
• Flexible office working hours
• Fast-paced start-up environment
SENIOR DATA ENGINEER:
ROLE SUMMARY:
Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.
RESPONSIBILITIES:
- Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
- Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
- Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
- Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
- DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
- Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
- Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
- Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns.
- Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
- Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
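One of the SCD strategies mentioned above, Type 2 (close the current dimension row and insert a new current row when a tracked attribute changes), can be sketched as follows. The table and column names are hypothetical, and SQLite stands in for a real warehouse engine.

```python
import sqlite3

# Minimal SCD Type 2 sketch: on an attribute change, expire the current
# row (set valid_to, clear is_current) and insert a fresh current row.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Chennai', '2024-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, change_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] != new_city:
        # Close out the old version of the row...
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        # ...and open a new current version.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date))

apply_scd2(conn, 1, "Bengaluru", "2024-06-01")
```

After the update, history is preserved: the Chennai row remains with `valid_to` set, and a new current Bengaluru row exists.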
REQUIRED QUALIFICATIONS:
- Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
- Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
- Programming: Python (preferred), PySpark, SQL for analytics, window functions, and performance tuning.
- ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
- Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
- DevOps/IaC: Extensive hands-on Terraform practice; CI/CD with GitHub Actions or Jenkins; config governance and release management.
- Serverless and events: Design event-driven distributed systems on AWS.
- NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
- AI services: AWS Entity Resolution, AWS Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification.
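The window-function skills listed above can be illustrated with a common analytics deduplication pattern: keep only the latest record per key using `ROW_NUMBER()`. SQLite (3.25+) is used here purely as a stand-in for a warehouse engine such as Snowflake or BigQuery; the schema is invented for the example.

```python
import sqlite3

# Deduplicate to the latest event per user with a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, ts TEXT, status TEXT);
INSERT INTO events VALUES
  (1, '2024-01-01', 'new'),
  (1, '2024-03-01', 'active'),
  (2, '2024-02-01', 'new');
""")
latest = conn.execute("""
SELECT user_id, status FROM (
  SELECT user_id, status,
         ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts DESC) AS rn
  FROM events
) WHERE rn = 1
ORDER BY user_id
""").fetchall()
print(latest)  # → [(1, 'active'), (2, 'new')]
```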
NICE-TO-HAVE QUALIFICATIONS:
- Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
- Table and file formats: Apache Parquet; Apache Iceberg as analytical table format.
- Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking with index/vector store results.
- Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.
OUTCOMES AND MEASURES:
- Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
- Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
- Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
- Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
- Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.
LOCATION AND SCHEDULE:
● Location: Outside US (OUS).
● Schedule: Minimum 6 hours of overlap with US time zones.
Skills: Python, Robot Framework, Automation Testing, Selenium
Requirement:
- B.E/B.Tech Degree from a reputed institution with at least 4 years of relevant experience.
- Hands-on experience with test automation using Python or Java.
- Experience with test frameworks such as Robot Framework or TestNG.
- Traffic generation tools usage such as curl-loader or JMeter.
- QA engineers who have tested networking technology products such as switches, routers, and L4-L7 appliances are also encouraged to apply.
- Knowledge of public clouds, AWS, Azure or GCP, is desired.
- Working knowledge in Kubernetes, Docker or Openshift environments required.
- Hands on experience with Linux.
- Strong problem solving and debugging abilities.
- Familiarity with continuous integration tools such as Jenkins or CircleCI.
- Interest in machine learning (ML) and data science is a plus.
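The traffic-generation tools mentioned above (curl-loader, JMeter) are built on the pattern of issuing many requests concurrently and tallying the outcomes. A minimal Python sketch of that pattern, using a stubbed request function in place of a real HTTP call, might look like this:

```python
from concurrent.futures import ThreadPoolExecutor

# send_request is a stub; a real load test would call an HTTP client
# here and return the response status code.
def send_request(i):
    return 200  # stand-in for resp.status_code

def run_load(num_requests=50, workers=10):
    """Fire num_requests calls across a worker pool; return (ok, total)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        codes = list(pool.map(send_request, range(num_requests)))
    return sum(1 for c in codes if c == 200), num_requests

ok, total = run_load()
print(f"{ok}/{total} requests succeeded")  # → 50/50 requests succeeded
```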
- Communicating with customers, making outbound calls to potential customers and following up on leads.
- Understanding customers' needs and identifying sales opportunities.
- Answering potential customers' questions and sending additional information via email.
- Keeping up with product and service information and updates.
- Creating and maintaining a database of current and potential customers.
- Explaining and demonstrating features of products and services.
- Staying informed about competing products and services.
- Upselling products and services.
- Researching and qualifying new leads.
- Closing sales and achieving sales targets.
We are seeking extremely smart and independent senior developers who are enthusiastic about building large-scale systems that will have a big impact on millions of customers.
Responsibilities:
- Lead design and development of products working closely with business team.
- Independently own software components and co-own entire applications with a small group of fellow developers.
- Formally mentor junior software engineers on the team, reviewing design documents, (peer) reviewing code, providing design direction and guidance.
- Build performant, scalable, and secure enterprise-ready back-end architectures that can support millions of users in parallel.
- Establish strong engineering best practices and champion their adoption.
Requirements:
- 4+ years' experience in software product development and delivery.
- Bachelor's or Master's degree in engineering (preferably computer science or sister branches) from a reputed institute (preferably IITs, NITs, or other top engineering institutes).
- Strong grasp of CS fundamentals, algorithms and excellent problem-solving abilities.
- All experience should be from good product development or e-commerce background.
- Able to take ownership of working with at least one of mobile or web app teams for complete integration with backend.
- Must have shown good stability in all your previous associations.
- Have strong backend knowledge and cloud development exposure.
- Proficiency in Java, Spring Boot, Hibernate, and REST API development.
- Have worked with at least one RDBMS (MySQL preferred).
- NoSQL experience is a plus.
- Hands-on with microservices, Docker, Kubernetes, Gradle/Ant, Kafka, and Git/Bitbucket in an agile workplace.
- You write high-quality code backed by unit tests and integration tests.
- Comfortable exploring proven open-source tools such as Grafana, Kibana, Jira, Prometheus, caches like Redis/Memcached, and task queues like Celery, to name a few.
- Knowledge of test-driven development and the AWS tech stack is a plus.
- Solid background and proven experience in the following tech stack:
- Strong proficiency in JavaScript
- Proficiency in HTML, CSS, and jQuery
- Familiarity with JavaScript frameworks such as ReactJS
- Experience with ES6, Redux, regular expressions, Material UI, and MUI-Datatables is a plus
- Familiarity with REST APIs
- Excellent communication and team handling skills
- Great attention to detail and problem-solving skills
Ésah Tea is a brand devoted to giving you the ultimate tea experience. We believe that joy and warmth can be shared in many forms but are best shared over a cup of tea. Our tea is made with absolute care and honesty by the best growers in Assam, who believe in giving only their best.
Our tea is the perfect harmony of aesthetics, flavor, complexity, and health. Our mission is to give you the best tea experience, born of the respect we have for the art, craft, and culture of tea making. We source directly from the tea gardens of Assam, from the most authentic growers with finesse in the cultivation of tea. We care about our environment and believe in sustainable and ethical extraction and production. Our initiative to switch to plastic-free teabags is a result of adopting sustainability as an ethic.
Ésah Tea recently raised a Pre-Series A round from a well-known VC and has milestones the company would like to achieve. As is well known in the eCommerce industry, achieving higher milestones month after month, year after year, is the standard. Hence, Ésah Tea is looking to build a core team that can take it to the next stage of its eCommerce journey. So if you think you are a perfect fit for a startup, let's have a chat.
The Job
- Manage paid acquisition channels like Google Ads and Facebook Ads. Analyze the results and optimize individual campaigns.
- Be responsible for the ROI of dozens of campaigns you execute.
- Own, drive, and report on crucial marketing efficiency metrics such as CAC, LTV, and ROAS.
- Understand your target audience and customer behaviors. You create strategies and campaigns, understanding the entire user journey from account creation until purchase.
- Analyze creative data and make recommendations to the Creative designers
- Come up with a hypothesis and test it scientifically to consolidate learnings, thereby increasing the baseline of the team's quality.
- Document findings in a structured way to contribute to the company's global knowledge base
- Collaborate with international peers in other geographical markets to keep the highest possible baseline quality in your campaigns.
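For reference, the efficiency metrics named above have standard definitions that can be computed directly; the figures in the demo call below are made up purely for illustration.

```python
# Standard definitions of the acquisition-efficiency metrics; the LTV
# formula here is a deliberately simple variant (several exist).
def cac(ad_spend, new_customers):
    """Customer acquisition cost: spend per newly acquired customer."""
    return ad_spend / new_customers

def roas(revenue, ad_spend):
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / ad_spend

def ltv(avg_order_value, orders_per_year, years_retained):
    """Simple lifetime value: order value x order frequency x retention."""
    return avg_order_value * orders_per_year * years_retained

print(cac(50_000, 250), roas(150_000, 50_000), ltv(40.0, 5, 2))  # → 200.0 3.0 400.0
```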
The Ideal Candidate
- 2-4 years of paid marketing experience, preferably in consumer-facing products
- Fluent in English (Advanced writing and speaking English skills)
- Experience with creative experimentation
- Experience with fast-paced, high growth startups
- Strong analytical and problem-solving skills
- A natural strategist
- Makes fast business decisions that prove to be profitable
Nice to have
- SQL
- Understanding of ad platforms algorithms
- Experience with programmatic media (DSPs)
Note: Must Be willing to relocate to Guwahati, Assam
- Designing and building mobile applications for Apple’s iOS platform.
- Collaborating with the design team to define app features.
- Ensuring quality and performance of the application to specifications.
- Identifying potential problems and resolving application bottlenecks.
- Fixing application bugs before the final release.
- Publishing application on App Store.
- Maintaining the application code and automating repetitive processes.
- Designing and implementing application updates.
We're hiring full-stack React.js and Node.js developers to join our team in the UK! You will be responsible for building fluid and responsive user interfaces for multiple platforms and devices.
We are looking for experienced engineers who have an appetite for solving complex problems and build seamless user interactions to world-class standards.
Requirements
You’re great at
- Strong experience in React.js & Node.js
- GraphQL
- Creating testable code and making testing a priority;
- Agile methodologies such as Scrum and Kanban;
- Being passionate, self-driven and working with little supervision towards a common team or company purpose
- Writing and consuming RESTful interfaces
- Vanilla JavaScript and modern frameworks and platforms; Node.js experience is essential;
- AWS
- Serverless
- Developing great user interfaces
It also would be cool if you
- Know module bundlers and task runners such as Webpack, Parcel, Rollup, Browserify, Grunt or Gulp;
- Have experience with Test Automation
- Have experience in databases, both SQL and document-based
- Use state management libraries like Redux, MobX, or others;
- Use of CSS pre/post-processors like PostCSS, Styled Components, LESS, SASS and/or others
Benefits
We offer
- Flexible working
- Competitive salary
- Annual bonus, subject to company performance
- Access to Udemy online training and opportunities to learn and grow within the role
- Good Healthcare Benefits
- Enhanced maternity, paternity, shared parental, and adoption leave and pay
- Autonomous working
This position is for our Chennai office; however, work from home can be considered during pandemics or lockdowns.
- Java / JavaScript Developer Job Description (minimum 2 years of experience in Java)
- Good in Core Java, multithreading, J2EE, and web services (REST, SOAP).
- In-depth knowledge of JavaScript and object-oriented techniques.
- Should be able to work independently.
- Familiarity with front-end technologies like the AngularJS framework, HTML5, and CSS3 is a real plus.



