


Key Responsibilities:
- Data Collection and Preparation:
  - Gathering data from various sources (databases, APIs, files).
  - Cleaning and preprocessing data (handling missing values, outliers, inconsistencies).
  - Transforming data into a suitable format for analysis.
- Data Analysis:
  - Performing exploratory data analysis (EDA) to identify patterns and trends.
  - Applying statistical techniques and machine learning algorithms.
  - Creating data visualizations (charts, graphs) to communicate findings.
- Model Development and Evaluation:
  - Assisting in the development and training of machine learning models.
  - Evaluating model performance using appropriate metrics.
  - Contributing to model tuning and optimization.
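For illustration only, a minimal sketch of the cleaning and evaluation steps listed above might look like the following (the file name, column names, and thresholds are invented for the example; pandas and scikit-learn are assumed):

    # Sketch only: "customers.csv" and its columns are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("customers.csv")            # gather data from a file source

    # Clean and preprocess: fill missing values, trim extreme outliers.
    df["age"] = df["age"].fillna(df["age"].median())
    df = df[df["income"].between(df["income"].quantile(0.01),
                                 df["income"].quantile(0.99))]

    # Quick exploratory summary to spot patterns and inconsistencies.
    print(df.describe())

    # Train and evaluate a simple baseline model with standard metrics.
    X, y = df[["age", "income"]], df["churned"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))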

About Vector Labs Tech
Description:
JOB RESPONSIBILITY
• Candidate should be proficient on both the back-end and the front-end side.
• Build the front end of applications with appealing visual design.
• Experience in writing SQL queries is a must.
• Candidate should be good at understanding, analyzing, and developing business requirements, and should be able to deliver them within the given time.
• Candidate should have good problem-solving skills.
• Good knowledge of Agile and the Software Development Life Cycle (SDLC) process.
• Candidate should be able to contribute at both the individual and the team level.
• Should be able to communicate with the business team to gather requirements and provide solutions for them.
QUALIFICATION
• B.Tech /B.E /MCA
EXPERIENCE
• 4+ years' experience as a full-stack developer
• 5-7 years of strong experience with Java 8 and Angular 7 or above
SKILLS AND COMPETENCIES
• 4+ years' experience as a full-stack developer
• 6+ years of strong experience with Java 8 and Angular 7 or above
• Working knowledge of HTML5, CSS3 or above, and Bootstrap.
• Strong experience with Spring MVC, Spring Core, Spring Data JPA/Hibernate/JPA, Spring Boot, Spring Transactions, and Spring Security.
• Experience with Oracle SQL Developer or any other SQL tool, and should be good at writing SQL queries.
• Strong experience developing web services (RESTful web services/APIs) using Java.
• Good communication and teamwork skills.
• Good to have: knowledge of AWS, React.js, Node.js, and JavaScript.
Primary Skill Set: Angular, Java, Microservices, Spring Boot
Job Title: QA Automation Engineer
Location: Andheri, Mumbai
Experience: 2 to 4 Years
Work Mode: 5 Days Work From Office (WFO)
Joining: Immediate Joiners Preferred
(Looking for local candidates from Mumbai)
Job Overview:
We are looking for a proactive and detail-oriented QA Automation Engineer with 2 to 4 years of experience. The ideal candidate must have solid experience in automation testing, strong command over SQL, and hands-on proficiency in API testing using Postman. This is a full-time on-site role based in Andheri, Mumbai, and we are open to candidates from diverse project backgrounds as long as the core skills are strong.
Key Responsibilities:
- Develop and maintain automated test cases for APIs and backend systems.
- Perform end-to-end API validations using Postman, including request construction, response validation, and chaining requests.
- Write and optimize SQL queries to validate business logic and data integrity in backend systems.
- Collaborate closely with developers, business analysts, and QA team members to define test requirements.
- Track, report, and follow up on bugs and issues using tools like JIRA or similar.
- Ensure timely regression, smoke, and sanity testing before every release.
- Participate in Agile ceremonies and contribute to improving testing processes and coverage.
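For illustration only, the request construction, response validation, and chaining described above could be sketched as follows in Python (using requests in place of Postman; the base URL, endpoints, and fields are hypothetical):

    # Hypothetical API: a Postman collection would express the same checks
    # as request bodies plus test-script assertions.
    import requests

    BASE_URL = "https://api.example.com"   # placeholder

    def test_create_then_fetch_order():
        # 1. Construct and send the request.
        payload = {"customer_id": 42, "amount": 199.0}
        create = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
        assert create.status_code == 201

        # 2. Validate the response body.
        body = create.json()
        assert body["amount"] == payload["amount"]

        # 3. Chain the returned id into a follow-up request.
        fetch = requests.get(f"{BASE_URL}/orders/{body['id']}", timeout=10)
        assert fetch.status_code == 200
        assert fetch.json()["customer_id"] == payload["customer_id"]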
Mandatory Skills:
- ✅ 2–4 years of experience in Automation Testing
- ✅ Strong knowledge of SQL and ability to write complex queries
- ✅ Practical, hands-on experience with Postman for API Testing
- ✅ Solid understanding of software QA methodologies, tools, and processes
- ✅ Familiarity with Agile development and continuous testing workflows
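As a small, hypothetical example of the SQL-based data-integrity checks implied above (the database file, tables, and columns are invented):

    # Sketch only: flags orders that reference a non-existent customer.
    import sqlite3

    conn = sqlite3.connect("app.db")       # placeholder database
    orphaned = conn.execute(
        """
        SELECT o.id
        FROM orders AS o
        LEFT JOIN customers AS c ON c.id = o.customer_id
        WHERE c.id IS NULL
        """
    ).fetchall()
    assert not orphaned, f"Orders with missing customers: {orphaned}"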
Please Apply - https://zrec.in/sEzbp?source=CareerSite
About Us
Infra360 Solutions is a services company specializing in Cloud, DevSecOps, Security, and Observability solutions. We help technology companies adopt a DevOps culture in their organization by focusing on a long-term DevOps roadmap. We focus on identifying the technical and cultural issues in the journey of successfully implementing DevOps practices in an organization, and we work with the respective teams to fix those issues and increase overall productivity. We also run training sessions for developers to help them understand the importance of DevOps. We provide these services: DevOps, DevSecOps, FinOps, Cost Optimization, CI/CD, Observability, Cloud Security, Containerization, Cloud Migration, Site Reliability, Performance Optimization, SIEM and SecOps, Serverless Automation, Well-Architected Reviews, MLOps, and Governance, Risk & Compliance.

We assess a technology company's architecture, security, governance, compliance, and DevOps maturity model, and help it optimize its cloud costs, streamline its technology architecture, and set up processes to improve the availability and reliability of its websites and applications. We set up tools for monitoring, logging, and observability, and we focus on bringing the DevOps culture to the organization to improve its efficiency and delivery.
Job Description
Job Title: DevOps Engineer Azure
Department: Technology
Location: Gurgaon
Work Mode: On-site
Working Hours: 10 AM - 7 PM
Terms: Permanent
Experience: 2-4 years
Education: B.Tech/MCA/BCA
Notice Period: Immediately
Infra360.io is searching for a DevOps Engineer to lead our group of IT specialists in maintaining and improving our software infrastructure. You'll collaborate with software engineers, QA engineers, and other IT professionals in deploying, automating, and managing the software infrastructure. As a DevOps Engineer, you will also be responsible for setting up CI/CD pipelines, monitoring programs, and cloud infrastructure.
Below is a detailed description of the roles, responsibilities, and expectations for the role.
Tech Stack :
- Kubernetes: Deep understanding of Kubernetes clusters, container orchestration, and its architecture.
- Terraform: Extensive hands-on experience with Infrastructure as Code (IaC) using Terraform for managing cloud resources.
- ArgoCD: Experience in continuous deployment and using ArgoCD to maintain GitOps workflows.
- Helm: Expertise in Helm for managing Kubernetes applications.
- Cloud Platforms: Expertise in AWS, GCP or Azure will be an added advantage.
- Debugging and Troubleshooting: The DevOps Engineer must be proficient in identifying and resolving complex issues in a distributed environment, ranging from networking issues to misconfigurations in infrastructure or application components.
Key Responsibilities:
- CI/CD and configuration management
- Doing RCA of production issues and providing resolution
- Setting up failover, DR, backups, logging, monitoring, and alerting
- Containerizing different applications on the Kubernetes platform
- Capacity planning of the infrastructure for different environments
- Ensuring zero outages of critical services
- Database administration of SQL and NoSQL databases
- Infrastructure as Code (IaC)
- Keeping the cost of the infrastructure to the minimum
- Setting up the right set of security measures
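Purely as a sketch of the monitoring/alerting side of these responsibilities, and assuming the official kubernetes Python client with a configured kubeconfig, a tiny health check might look like this:

    # Sketch only: prints pods that are not Running/Succeeded, the kind of
    # signal that might feed an alerting pipeline.
    from kubernetes import client, config

    config.load_kube_config()              # or config.load_incluster_config()
    v1 = client.CoreV1Api()

    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.namespace}/{pod.metadata.name} is {phase}")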
Ideal Candidate Profile:
- A graduate/postgraduate degree in Computer Science or a related field
- 2-4 years of strong DevOps experience in a Linux environment
- Strong interest in working with our tech stack
- Excellent communication skills
- Able to work with minimal supervision; a self-starter
- Hands-on experience with at least one scripting language - Bash, Python, Go, etc. (a short sketch follows this list)
- Experience with version control systems like Git
- Strong experience with Amazon Web Services (EC2, RDS, VPC, S3, Route53, IAM, etc.)
- Strong experience managing production systems day in and day out
- Experience finding and fixing issues across different layers of the architecture in a production environment
- Knowledge of SQL and NoSQL databases, Elasticsearch, Solr, etc.
- Knowledge of networking, firewalls, load balancers, Nginx, Apache, etc.
- Experience with automation tools like Ansible/SaltStack and Jenkins
- Experience with the Docker/Kubernetes platform and managing OpenStack (desirable)
- Experience with HashiCorp tools, e.g. Vault, Vagrant, Terraform, Consul, VirtualBox, etc. (desirable)
- Experience managing/mentoring a small team of 2-3 people (desirable)
- Experience with monitoring tools like Prometheus/Grafana/Elastic APM.
- Experience with logging tools like ELK/Lo
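To make the scripting-plus-AWS expectation above concrete, here is a minimal boto3 sketch (the required "CostCenter" tag is an invented convention; configured AWS credentials are assumed):

    # Sketch only: lists EC2 instances missing a cost-allocation tag.
    import boto3

    ec2 = boto3.client("ec2")
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if "CostCenter" not in tags:
                print(f"Untagged instance: {instance['InstanceId']}")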


- Partnering with internal business owners (product, marketing, edit, etc.) to understand needs and develop custom analysis to optimize for user engagement and retention
- Good understanding of the underlying business and the workings of cross-functional teams for successful execution
- Design and develop analyses based on business requirements and challenges.
- Leveraging statistical analysis on consumer research and data mining projects, including segmentation, clustering, factor analysis, multivariate regression, predictive modeling, etc.
- Providing statistical analysis on custom research projects and consulting on A/B testing and other statistical analysis as needed (a brief illustrative sketch follows this list). Other reports and custom analysis as required.
- Identify and use appropriate investigative and analytical technologies to interpret and verify results.
- Apply and learn a wide variety of tools and languages to achieve results
- Use best practices to develop statistical and/ or machine learning techniques to build models that address business needs.
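As a brief illustrative sketch of the A/B-testing consultation mentioned in this list (the conversion counts are invented; statsmodels is assumed):

    # Sketch only: two-proportion z-test on made-up conversion counts.
    from statsmodels.stats.proportion import proportions_ztest

    conversions = [310, 355]       # variant A, variant B
    visitors = [10000, 10000]

    z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Difference is statistically significant at the 5% level.")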
Requirements
- 2 - 4 years of relevant experience in Data science.
- Preferred education: Bachelor's degree in a technical field or equivalent experience.
- Experience in advanced analytics, model building, statistical modeling, optimization, and machine learning algorithms.
- Machine Learning Algorithms: crystal-clear understanding, coding, implementation, error-analysis, and model-tuning knowledge of Linear Regression, Logistic Regression, SVM, shallow neural networks, clustering, Decision Trees, Random Forest, XGBoost, recommender systems, ARIMA, and anomaly detection; plus feature selection, hyperparameter tuning, model selection, error analysis, boosting, and ensemble methods (a brief illustrative sketch follows this list).
- Strong with programming languages like Python, data processing using SQL or equivalent, and the ability to experiment with newer open-source tools.
- Experience normalizing data to ensure it is homogeneous and consistently formatted to enable sorting, querying, and analysis.
- Experience designing, developing, implementing, and maintaining databases and programs to manage data analysis efforts.
- Experience with big data and cloud computing, viz. Spark and Hadoop (MapReduce, Pig, Hive).
- Experience in the risk and credit scoring domains is preferred.
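As a brief illustrative sketch of the hyperparameter tuning and model selection listed above (synthetic data stands in for a real credit-risk dataset; the parameter grid is invented):

    # Sketch only: grid search over a small random-forest parameter grid.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import GridSearchCV, train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    grid = GridSearchCV(
        RandomForestClassifier(random_state=0),
        param_grid={"n_estimators": [100, 300], "max_depth": [5, 10, None]},
        scoring="roc_auc",
        cv=3,
    )
    grid.fit(X_train, y_train)
    print("Best params:", grid.best_params_)
    print("Test AUC:", roc_auc_score(y_test, grid.predict_proba(X_test)[:, 1]))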
Manage end-to-end recruitment for internal hiring requirements: understanding the role, creating the JD, posting the job, filtering and screening candidates, conducting interviews, coordinating with candidates, and handling verification and joining formalities.

The present role is a Data Engineer role for the Crewscale-Toplyne collaboration.
Crewscale is an exclusive partner of Toplyne.
About Crewscale:
Crewscale is a premium technology company focused on helping companies build world-class, scalable products. We are a product-based start-up with a code assessment platform that is used by top technology disruptors across the world.
Crewscale works with premium product companies (Indian and international) like Swiggy, ShareChat, Grab, Capillary, Uber, Workspan, Ovo, and many more. We are responsible for managing infrastructure for Swiggy as well.
We focus on building only world-class tech products, and our USP is building technology that can handle scale from 1 million to 1 billion hits.
We invite candidates who have a zeal to develop world-class products to come and work with us.
Toplyne
Who are we? 👋
Toplyne is a global SaaS product built to help revenue teams at businesses with a self-service motion and a large user base identify which users to spend time on, when, and for what outcome. Think self-service or freemium-led companies like Figma, Notion, Freshworks, and Slack. We do this by helping companies recognize signals across their product engagement, sales, billing, and marketing data.
Founded in June 2021, Toplyne is backed by marquee investors like Sequoia, Together Fund, and a number of well-known angels. You can read more about us at https://bit.ly/ForbesToplyne and https://bit.ly/YourstoryToplyne.
What will you get to work on? 🏗️
- Design, develop, and maintain scalable data pipelines and a data warehouse to support continuing increases in data volume and complexity.
- Develop and implement processes and systems to monitor data quality and data mining, ensuring production data is always accurate and available for the key partners and business processes that depend on it.
- Perform the data analysis required to solve data-related issues and assist in their resolution.
- Complete ownership - You'll build highly scalable platforms and services that support rapidly growing data needs at Toplyne. There's no instruction book; it's yours to write. You'll figure it out, ship it, and iterate.
What do we expect from you? 🙌🏻
- 3-6 years of relevant work experience in a Data Engineering role.
- Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
- Experience building and optimising data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- A good understanding of Airflow, Spark, NoSQL databases, and Kafka is nice to have (a toy pipeline sketch follows this list).
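For illustration only, a toy Airflow DAG of the pipeline shape referred to above (the DAG id, schedule, and task bodies are invented and not Toplyne's actual pipeline; Airflow 2.4+ is assumed):

    # Toy DAG: older Airflow versions use schedule_interval instead of schedule.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw events from the source system")

    def transform():
        print("clean and aggregate the events")

    def load():
        print("write the result to the warehouse")

    with DAG(
        dag_id="example_events_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load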
Java_FullstackDeveloper_DGT II
Required Skills:
Desired Skills:
- Design collaterals for our performance marketing team across multiple display networks, social media, and native platforms.
- Execute our visual strategy and calendar to be deployed across all digital channels.
- Work with our copywriter to conceptualize compelling designs that increase the brand's overall reach, engagement, and conversions to sales.
- Work on design collaterals like social media creatives, videos, infographics, blog & web banners, emails, landing pages, etc.
- Optimize designs and visual language to match the relevant channel requirements but also maintain our overall brand tone.
- Monitor and implement relevant digital design trends to keep our brand communication engaging and up-to-date with industry standards.
- 3 - 5 years of experience designing digital content and marketing collaterals, with a good portfolio of work.
- Has a clean visual design sense with good conceptual, typography, grid, and illustration skills.
- Animation and video editing skills will be an added advantage, but not mandatory.
- Can work independently and in a team to manage multiple projects and aggressive deadlines.
- Ability to work in a fast-paced environment with high volumes.
- Exceptional attention-to-detail.
- Must be proficient in Adobe Creative Suite.
- Understand competitive and best-in-class design practices for digital content.
- Ability to uncover digital design trends and implement them wherever required.
We are looking for a back-end developer with 2+ years of experience to join our team.
You will be responsible for the development and management of the server side of our web applications, in terms of both quality and scalability.
You should be passionate about writing optimized code, solving problems in real time, and data structures and algorithms.
Responsibilities :
- Write clean code to develop functional web applications
- Should be adaptable to good practices and standards
- Capable of writing scalable code.
- Gather and address technical and design requirements
- Build reusable code and libraries for future use
- Follow emerging technologies
Required Candidate profile
Requirements :
- Hands-on experience with Node.js, TypeScript, MongoDB, Elasticsearch, Kafka, RESTful APIs, Python (Django), Angular, and Flutter/Dart
- Excellent analytical and time management skills
- Teamwork skills with a problem-solving attitude
- Understanding of the deployment process with AWS/DigitalOcean/Microsoft Azure
- Good understanding of system design.
- Should be aware of design patterns
- Understanding of TDD (a tiny illustrative sketch follows this list)
- Unit test frameworks - Chai, Mocha, Sinon, Supertest, Knock
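Since Python (Django) appears in the stack above, here is a tiny TDD-style illustration in Python with pytest (the cart_total helper is invented for the example):

    # TDD sketch: the test pins down behaviour first; the helper below is the
    # minimal implementation that makes it pass.
    def cart_total(items, tax_rate=0.0):
        subtotal = sum(price * qty for price, qty in items)
        return round(subtotal * (1 + tax_rate), 2)

    def test_cart_total_applies_tax():
        items = [(100.0, 2), (50.0, 1)]    # (price, quantity) pairs
        assert cart_total(items, tax_rate=0.1) == 275.0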
What we are looking for
- We are looking for a candidate with around 2 years of experience in backend development who is a team player with a very good attitude.

