● Design, develop, and execute automated test scripts for web and mobile applications.
● Develop and execute test plans, test cases, and test reports.
● Develop data-driven, modular automation scripts that can be reused with minimal additions.
● Design and run performance and load tests to measure conformance to performance benchmarks.
● Work with developers to identify and resolve problems.
● Ensure that test environments are set up accurately.
● Work with testing staff, mentoring them on test automation.
● Take responsibility for own and team goals; give adequate direction to junior testing staff.
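The "data-driven, modular" pattern in the responsibilities above separates test data from test logic so new cases need no code changes. A minimal sketch in plain Python, with a hypothetical page object and invented credentials; a real suite would drive a browser through Selenium or Appium rather than this stub:

```python
# Minimal sketch of a data-driven, reusable test module (hypothetical app).
# In a real suite, LoginPage's actions would wrap Selenium/Appium calls.

class LoginPage:
    """Page-object stub: encapsulates login actions so tests stay modular."""
    VALID = {"alice": "s3cret"}  # stand-in for the real application backend

    def login(self, user: str, password: str) -> bool:
        return self.VALID.get(user) == password

# Test data lives apart from test logic; adding a case is a data edit only.
CASES = [
    ("alice", "s3cret", True),     # happy path
    ("alice", "wrong", False),     # bad password
    ("mallory", "s3cret", False),  # unknown user
]

def run_login_cases(page: LoginPage) -> list[bool]:
    return [page.login(u, p) == expected for u, p, expected in CASES]

if __name__ == "__main__":
    assert all(run_login_cases(LoginPage()))
```

The same shape maps directly onto pytest's `parametrize` or Cucumber scenario outlines when the suite grows.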
Qualifications
● Bachelor's degree in Computer Science or a similar field
● 4-6 years of experience with software testing in n-tiered architectures deployed in large-scale environments
● Experience developing test automation for both web and mobile applications using Selenium, Appium, Cucumber, and BDD practices
● Knowledge of open-source frameworks such as Cypress and TestProject.io is good to have
● Experience with a programming language such as Python, JavaScript, or Java
● Experience with load and performance testing
● Experience working with database environments
● Possesses strong verbal and written communication skills

About Careator Technologies Pvt Ltd
Careator Technology is an end-to-end service provider for all your application development, recruitment, and staffing requirements. Our primary competitive advantage is our ability to provide customized solutions that cover the whole process from start to finish. Careator maintains a broad variety of productive partnerships with top-tier organizations, and the firm ensures that each of its commercial associates is provided with an effective solution.
The founders of Careator understand that the combination of people and technology is essential to the success of a business. The company places a high premium on its employees and their relationships. This is reflected in the way that it creates its operational and strategic models, how it responds to consumers, how it connects with people, and how it promotes diversity in the workplace and the community.
Strong Data Engineer profile
Mandatory (Experience 1): Must have 6+ months of hands-on Data Engineering experience.
Mandatory (Experience 2): Must have end-to-end experience in building and maintaining ETL/ELT pipelines (not just BI/reporting).
Mandatory (Technical): Must have strong SQL capability.

Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
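The pipeline-testing skill above (evaluating results against data quality specifications) usually boils down to a set of boolean checks run over pipeline output before it is published. A toy sketch with an invented schema and thresholds; production teams might express the same checks in a framework such as Great Expectations:

```python
# Minimal data-quality gate for pipeline output (schema/thresholds invented).
# Each check is a named boolean so failures can be reported individually.

REQUIRED_COLUMNS = {"id", "amount"}

def validate(rows: list[dict]) -> dict[str, bool]:
    return {
        "schema_ok": all(REQUIRED_COLUMNS <= row.keys() for row in rows),
        "no_null_ids": all(row.get("id") is not None for row in rows),
        "amounts_non_negative": all((row.get("amount") or 0) >= 0 for row in rows),
        "row_count_ok": len(rows) > 0,
    }

if __name__ == "__main__":
    good = [{"id": 1, "amount": 10}, {"id": 2, "amount": 0}]
    assert all(validate(good).values())
```

A pipeline run would fail (or route rows to quarantine) when any check in the returned dict is False.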
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, and Azure ADF and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
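The "SQL for analytics and windowing functions" knowledge above can be illustrated with Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the table and data here are invented for the example. A window function ranks or aggregates within a partition without collapsing rows the way GROUP BY would:

```python
import sqlite3

# Hypothetical orders table: rank each order within its region by amount,
# and carry the region total alongside every row.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (region TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

rows = con.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM orders
    ORDER BY region, rnk
""").fetchall()

for region, amount, rnk, total in rows:
    print(region, amount, rnk, total)
# east 300 1 400
# east 100 2 400
# west 200 1 250
# west 50 2 250
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` shape carries over to BigQuery, Snowflake, and Spark SQL.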
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
Design, build, and maintain scalable data pipelines using Databricks and PySpark.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
Ensure data quality, performance, and reliability across data workflows.
Participate in code reviews, data architecture discussions, and performance optimization initiatives.
Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
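The ETL/ELT pipelines required above follow the same extract-transform-load shape whether written in PySpark or plain Python. A toy sketch with invented records; a real pipeline would read from S3, run the transform as a Databricks/PySpark job, and load into Redshift or a Delta table:

```python
# Toy ETL sketch: extract -> transform -> load, with invented records.
# Production stages would be S3 reads, PySpark transforms, warehouse writes.

def extract() -> list[dict]:
    # Stand-in for reading raw events from S3 / a source system.
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "bad"},   # malformed row to be filtered
        {"user": "a", "amount": "4.5"},
    ]

def transform(rows: list[dict]) -> dict[str, float]:
    # Clean, cast, and aggregate per user; drop rows that fail validation.
    totals: dict[str, float] = {}
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # in practice: route to a dead-letter/quarantine table
        totals[row["user"]] = totals.get(row["user"], 0.0) + amount
    return totals

def load(totals: dict[str, float]) -> dict[str, float]:
    # Stand-in for writing to Redshift/Delta; here we just return the result.
    return totals

if __name__ == "__main__":
    print(load(transform(extract())))  # {'a': 15.0}
```

Orchestration (Airflow, as preferred above) would schedule each stage as a task and handle retries between them.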
Notice period - Immediate to 15 days
Location: Bangalore
Office: Karol Bagh. As discussed regarding hiring, sharing the job description:
1. Required: experienced and confident female candidates only
2. Post: Senior Tele Caller
3. Fixed salary of Rs. 15,000/- per month + benefits
4. No targets
5. Inbound calls
Kindly go through and do the needful.
Responsibilities:
- Collaborate with stakeholders to understand business objectives, processes, and requirements.
- Conduct thorough analysis of business processes, workflows, and systems to identify areas for improvement.
- Gather, document, and prioritize business requirements using techniques such as interviews, workshops, and surveys.
- Analyze data to identify trends, patterns, and insights that can inform decision-making and drive business outcomes.
- Develop and maintain documentation, including business process models, use cases, user stories, and requirements specifications.
- Work closely with cross-functional teams to ensure alignment between business needs and technical solutions.
- Facilitate meetings, workshops, and presentations to communicate findings, solicit feedback, and drive consensus among stakeholders.
- Collaborate with developers, designers, and QA analysts to ensure successful implementation and testing of solutions.
- Monitor and evaluate the performance of implemented solutions, and recommend adjustments or enhancements as needed.
- Stay updated on industry trends, best practices, and emerging technologies related to business analysis and process improvement.
Requirements:
- Bachelor's degree in Business Administration, Management Information Systems, or related field.
- Proven experience as a Business Analyst or similar role.
- Strong analytical skills, with the ability to gather and interpret data from multiple sources.
- Proficiency in business analysis techniques, such as process modeling, data analysis, and requirements elicitation.
- Excellent communication skills, including the ability to effectively communicate technical concepts to non-technical stakeholders.
- Solid understanding of project management principles and methodologies.
- Experience with business analysis tools and software (e.g., Microsoft Visio, Jira, Confluence) is a plus.
- Ability to work independently and collaboratively in a fast-paced environment.
- Strong organizational skills and attention to detail.
- Certification in business analysis (e.g., CBAP, PMI-PBA) is preferred but not required.
Job description
About Kofluence:
Kofluence is a start-up positioned as “AdWords for Social Media Influencer Marketing”. Led by online gaming leaders/IIM alumni, Kofluence has developed a technology platform where brands can reach out to their audience through micro-influencers. At Kofluence, you get an opportunity to create things from scratch. So, if you see yourself as curious, ambitious, innovative, and perennially hungry for growth, this is the place to be!
Overview:
We are looking to fill the position of Account Manager. The selected person will take end-to-end ownership of accounts, ensuring brands collaborate with Kofluence consistently while we deliver value and exceed expectations.
As a BD - Account Manager,
You Get:
• Amazing colleagues to work with
• A great office with a cool culture
• Work-from-Home option
• The freedom to do things and take decisions with logical reasoning
Your Responsibilities:
• Complete ownership of the new business development process from strategy to execution
• Own and deliver quarterly/annual order booking targets
• Create and execute brand communication campaigns, build and strengthen the brand's equity over time, and drive adoption rates for the brand
• Create and leverage synergies across partners in the overall ecosystem to maximize and sustain revenues
• Own and lead the end-to-end sales process, from lead generation and qualification to proposal submission, contract negotiation, and closure
Need to be available from 7 AM to 9 AM IST every day
Need extensive experience in React and Angular 8+ versions
Prior experience migrating from Angular 8+ to React is a must
We are looking for a DevOps Engineer to manage the interchange of data between the server and the users. Your primary responsibility will be the development of all server-side logic, the definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the frontend. You will also be responsible for integrating the front-end elements built by your co-workers into the application. Therefore, a basic understanding of front-end technologies is necessary as well.
What we are looking for
- Must have strong knowledge of Kubernetes and Helm 3
- Should have previous experience Dockerizing applications
- Should be able to automate manual tasks using Shell or Python
- Should have good working knowledge of the AWS and GCP clouds
- Should have previous experience working with Bitbucket, GitHub, or any other VCS
- Must be able to write Jenkins pipelines and have working knowledge of GitOps and ArgoCD
- Should have hands-on experience in proactive monitoring using tools like New Relic, Prometheus, Grafana, Fluent Bit, etc.
- Should have a good understanding of the ELK Stack
- Should have exposure to Jira, Confluence, and Sprints
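The "automate manual tasks using Shell or Python" item above might look like this minimal sketch, which prunes stale log files under a directory; the retention window and file layout are invented for illustration, and a real job would run from cron or a Jenkins pipeline:

```python
import os
import time
from pathlib import Path

# Hypothetical housekeeping task: delete *.log files older than max_age_days.

def prune_old_logs(log_dir: Path, max_age_days: int = 7) -> list[str]:
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in sorted(log_dir.glob("*.log")):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed

if __name__ == "__main__":
    import tempfile

    # Self-contained demo: one stale file, one fresh file.
    with tempfile.TemporaryDirectory() as tmp:
        log_dir = Path(tmp)
        old, fresh = log_dir / "old.log", log_dir / "fresh.log"
        old.write_text("stale")
        fresh.write_text("recent")
        # Backdate old.log by 10 days so it falls past the cutoff.
        os.utime(old, (time.time() - 10 * 86400,) * 2)
        print(prune_old_logs(log_dir))  # ['old.log']
```

Returning the list of removed files makes the task easy to log from a pipeline step instead of running silently.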
What you will do:
- Mentor junior DevOps engineers and raise the team's bar
- Act as the primary owner of tech best practices, tech processes, DevOps initiatives, and timelines
- Oversee all server environments, from Dev through Production
- Take responsibility for automation and configuration management
- Provide stable environments for quality delivery
- Assist with day-to-day issue management
- Take the lead in containerizing microservices
- Develop deployment strategies that allow DevOps engineers to successfully deploy code in any environment
- Enable the automation of CI/CD
- Implement dashboards to monitor various systems and applications

What we expect:
- 1-3 years of experience in DevOps
- Experience setting up front-end best practices
- Experience working in high-growth startups
- Ownership and a proactive attitude
- A mentorship and upskilling mindset
What you’ll get:
- Health benefits
- Innovation-driven culture
- Smart and fun team to work with
- Friends for life
Responsibilities
- Integration of user-facing elements developed by front-end developers
- Build efficient, testable, and reusable PHP modules
- Solve complex performance problems and architectural challenges
- Integration of data storage solutions.
Skills And Qualifications
- Strong knowledge of PHP web frameworks (such as Laravel, Yii, CodeIgniter)
- Understanding the fully synchronous behavior of PHP
- Understanding of MVC design patterns
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
- Knowledge of object-oriented PHP programming
- Minimum of 4 years of experience in Java development
- Experience delivering Services (REST, SOAP) and Web applications in Micro services architecture
- Experience developing and deploying Java solutions to cloud
- Experience in Spring Boot and components of Spring framework
- Experience in a JavaScript framework such as Angular or React
- Experience in TDD using Junit or similar frameworks
- Experience in design patterns and service-oriented architectural principles, data structures, and algorithms
- Should be an active participant in product design and code reviews for self and team, and able to competently review any aspect of their product or major subsystem
- Experience with SQL and Unix
- Good communication skills
