- Data Scientist with 4+ years of experience
- Strong working experience in computer vision and ML engineering
- Strong knowledge of statistical modeling, hypothesis testing, and regression analysis
- Experience developing APIs
- Proficiency in Python and SQL
- Working knowledge of Azure
- Basic knowledge of NLP
- Analytical thinking and problem-solving abilities
- Excellent communication and strong collaboration skills
- Ability to work independently
- Attention to detail and commitment to high-quality results
- Adaptability to fast-paced, dynamic environments
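The statistical fundamentals listed above (regression analysis and hypothesis testing) can be sketched with nothing but the standard library. The data values below are invented for illustration; a real analysis would use statsmodels or scipy.

```python
import statistics as st

# Ordinary least squares for y ≈ slope * x + intercept (toy data)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.0, 6.2, 7.9, 10.1]
mx, my = st.mean(x), st.mean(y)
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = my - slope * mx

# Welch's two-sample t statistic, computed by hand (toy samples)
a = [5.1, 4.9, 5.3, 5.0, 5.2]
b = [4.6, 4.5, 4.8, 4.7, 4.4]
t = (st.mean(a) - st.mean(b)) / (
    (st.variance(a) / len(a) + st.variance(b) / len(b)) ** 0.5
)
print(round(slope, 2), round(intercept, 2), round(t, 1))  # 1.99 0.09 5.0
```

A large |t| (here 5.0) is what would lead you to consult a t distribution for a p-value before rejecting the null hypothesis of equal means.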

Position: Senior Backend Developer (Python + FastAPI + GCP)
Engagement Type: Remote
Location: Remote
Position Overview:
Seeking a Senior Backend Developer with 6–8 years of experience to design, build, and maintain scalable backend systems using Python and FastAPI on Google Cloud Platform, ensuring high performance, reliability, and secure deployments.
Key Responsibilities:
• Design and build scalable, high-performance backend systems using Python (FastAPI) with clean architecture and modular design principles.
• Architect and implement microservices and RESTful APIs ensuring reliability, security, and optimal response times.
• Own end-to-end system design from schema definition to deployment with focus on high availability, fault tolerance, and cost efficiency.
• Manage DevOps pipelines for CI/CD, containerization, and infrastructure automation using Docker, Terraform, and GitHub Actions.
• Deploy, monitor, and optimize cloud infrastructure on Google Cloud Platform (GCP) including Compute Engine, Cloud Run, Pub/Sub, and Cloud Storage.
• Collaborate with frontend, AI, and data engineering teams to define robust API contracts and efficient data flows.
• Implement observability practices including logging, tracing, and alerting using Stackdriver, Prometheus, or Grafana.
• Conduct code reviews, system design sessions, and performance tuning for production-grade deployments.
Required Skills:
• Strong expertise in Python with a focus on FastAPI.
• Solid understanding of software architecture and design patterns.
• Experience with API design and authentication (OAuth2, JWT).
• Deep knowledge of Google Cloud Platform (GCP) services including Compute Engine, Pub/Sub, Cloud SQL, Cloud Storage, and IAM policies.
• Proficiency in containerization (Docker) and CI/CD pipelines.
• Familiarity with infrastructure-as-code tools such as Terraform or Cloud Build.
• Hands-on experience with databases like PostgreSQL, MySQL, or Firestore and ORM frameworks such as SQLAlchemy or Tortoise ORM.
• Knowledge of system monitoring, alerting, and logging in distributed systems.
Preferred (Bonus) Skills:
• Experience with API Gateway, Redis, or Celery for async processing.
• Familiarity with message queues like Kafka or RabbitMQ.
• Exposure to event-driven architecture and cloud cost optimization strategies.
• Understanding of DevSecOps and cloud security best practices.
• Experience working with cross-functional product and data teams.
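The OAuth2/JWT authentication named in the required skills comes down to signing and verifying a compact token. A minimal HS256 sketch using only the standard library is shown below; a production FastAPI service would use a vetted library such as PyJWT, and the secret and claims here are invented for illustration.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_jwt(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "user-123"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))   # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

In an OAuth2 flow the token server issues such a token after authentication, and every API request is then verified statelessly against the shared secret.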
- Assist in identifying potential clients and generating new business leads.
- Support the sales team in preparing proposals, presentations, and sales materials.
- Conduct market research to identify trends and opportunities.
- Maintain and update the CRM system with accurate client information.
- Participate in client meetings and calls to understand customer needs.
- Collaborate with marketing and product teams to align sales strategies.
- Provide administrative support to the sales team as needed.
Backend Developer Job Description
Job Title: Backend Developer
Location: Nipania, Indore
Job Type: Full-time
Experience Level: Mid-Senior Level
Role Overview:
We are seeking an experienced Backend Developer to join our dynamic team. The ideal candidate will be responsible for building robust server-side applications and services. You will work closely with frontend developers to ensure seamless integration between the server-side logic and user-facing features.
Responsibilities:
- Implement efficient and secure backend services using Node.js and Python.
- Design and implement data storage solutions using MongoDB.
- Write effective APIs to support frontend functionalities.
- Optimize applications for performance, security, and scalability.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Maintain cloud services on AWS, including EC2, Amplify, S3, Route 53, CloudFront, Lambda, and the AWS CLI.
- Troubleshoot and debug issues across the backend stack.
Requirements:
- Proficient understanding of the MERN stack (MongoDB, Express.js, React.js, Node.js).
- Good experience using Docker and cloud services (AWS or Digital Ocean).
- Familiarity with microservices architecture and testing APIs with Postman.
- Knowledge of modern authorization mechanisms (e.g., JWT).
- Experience creating and maintaining CI/CD pipelines is a plus.
- Strong problem-solving skills with the ability to debug complex issues.
- Experience or knowledge of GraphQL is a plus.
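"Write effective APIs to support frontend functionalities" reduces to serving well-formed JSON over HTTP. The sketch below uses only Python's standard library as a stand-in; a real service in this stack would use Express (Node.js) or a Python framework, and the `/health` route and payload are invented for illustration.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Single illustrative route: GET /health returns a JSON status
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ApiHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.load(resp)
server.shutdown()
print(payload)  # {'status': 'ok'}
```

The same contract (status code, Content-Type, JSON body) is what a frontend or a Postman test suite would assert against.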
Benefits:
- Leave encashment
- Provident Fund
Schedule:
- Day shift
- Fixed shift
Location: 840, Hare Krishna Vihar, Nipania, Indore, Madhya Pradesh 452010
Responsibilities:
- Approach potential leads and close them within a stipulated time frame.
- Create exclusive distributors, exclusive super stockists, and business associates for the company to drive business growth.
- Guide, coordinate, and make strategic marketing plans for the sales team working under their jurisdiction.
- Find and develop new markets and improve sales.
- Keep abreast of stock reports, inventory, product orders, re-orders, etc.
- Train personnel and help team members develop their skills.
Requirements:
- Bachelor's degree in business, marketing, or a related field.
- Experience in sales, marketing, or a related field.
- Strong communication skills.
- Excellent organizational skills.
- Proficient in Word, Excel, Outlook, and PowerPoint.
- BS/MS degree in Computer Science, Engineering, or a related subject
- Proven working experience in software development
- Working experience in iOS development
- Have published one or more iOS apps in the App Store
- A deep familiarity with Objective-C and Cocoa Touch
- Experience working with iOS frameworks such as Core Data, Core Animation, Core Graphics and Core Text
- Experience with third-party libraries and APIs
- Working knowledge of the general mobile landscape, architectures, trends, and emerging technologies
- Solid understanding of the full mobile development life cycle
EXPERIENCE: 3-10 years
LOCATIONS: Bangalore, Chennai, and Kerala
Must have:
•Designing and developing user interfaces using Angular best practices.
•Making complex technical and design decisions for Angular projects.
• Mandatory skills: JavaScript, TypeScript, and Angular.
Company Description
At Bungee Tech, we help retailers and brands meet customers everywhere and on every occasion they are in. We believe that accurate, high-quality data, matched with compelling market insights, empowers retailers and brands to keep their customers at the center of all the innovation and value they deliver.
We provide retailers and brands with a clear and complete omnichannel picture of their competitive landscape. We collect billions of data points every day, multiple times a day, from publicly available sources. Using high-quality extraction, we uncover detailed information on products and services, which we automatically match and then proactively track for price, promotion, and availability. Anything we do not match helps identify a new assortment opportunity.
Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
You will also be responsible for integrating them with the architecture used in the company.
We're working on the future. If you are seeking an environment where you can drive innovation, want to apply state-of-the-art software technologies to solve real-world problems, and want the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.
Responsibilities
As an experienced member of the team, in this role, you will:
- Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development.
- Research, design, code, troubleshoot, and support. What you create is also what you own.
- Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces.
- Be able to broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation.
BASIC QUALIFICATIONS
- Bachelor’s degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar.
- 5+ years of relevant professional experience in Data Engineering and Business Intelligence.
- 5+ years with advanced SQL (analytical functions), ETL, and data warehousing.
- Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ ELT and reporting/analytic tools and environments, data structures, data modeling and performance tuning.
- Ability to effectively communicate with both business and technical teams.
- Excellent coding skills in Java, Python, C++, or equivalent object-oriented programming language
- Understanding of relational and non-relational databases and basic SQL
- Proficiency with at least one of these scripting languages: Perl / Python / Ruby / shell script
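The "advanced SQL (analytical functions)" requirement above refers to window functions such as `RANK()` and windowed aggregates. A minimal sketch using Python's bundled sqlite3 driver follows (window functions need SQLite 3.25+; the table and values are invented for illustration):

```python
import sqlite3

# Toy orders table to demonstrate analytical (window) functions
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', 30.0), ('alice', 10.0), ('bob', 20.0), ('bob', 50.0);
""")
rows = conn.execute("""
SELECT customer,
       amount,
       -- rank each customer's orders by size, largest first
       RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
       -- windowed aggregate: total per customer on every row
       SUM(amount) OVER (PARTITION BY customer) AS customer_total
FROM orders
ORDER BY customer, rnk
""").fetchall()
for row in rows:
    print(row)
# ('alice', 30.0, 1, 40.0)
# ('alice', 10.0, 2, 40.0)
# ('bob', 50.0, 1, 70.0)
# ('bob', 20.0, 2, 70.0)
```

Unlike a `GROUP BY`, the window keeps every input row while still exposing per-partition aggregates, which is the typical pattern in warehouse reporting queries.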
PREFERRED QUALIFICATIONS
- Experience with building data pipelines from application databases.
- Experience with AWS services - S3, Redshift, Spectrum, EMR, Glue, Athena, ELK etc.
- Experience working with Data Lakes.
- Experience providing technical leadership and mentoring other engineers on data engineering best practices.
- Sharp problem-solving skills and the ability to resolve ambiguous requirements.
- Experience working with Big Data.
- Knowledge of and experience working with Hive and the Hadoop ecosystem.
- Knowledge of Spark
- Experience working with Data Science teams
Looking for candidates from a services-based company or the services division of any company.
Minimum Qualification:
- Hands-on experience with Java (Java 8 language features: lambdas, collections, popular frameworks and libraries), the JVM, GC tuning, and performance tuning
- Worked on REST frameworks/libraries such as Spring MVC, Spring Boot, Dropwizard, RestExpress, etc.
- Worked on relational data stores such as MySQL, Oracle, or Postgres
- Worked on non-relational data stores such as Cassandra, HBase, Couchbase, MongoDB, etc.
- Worked on caching infrastructure such as Redis, Memcached, Aerospike, Riak, etc.
- Worked on queueing infrastructure such as Kafka, RabbitMQ, ActiveMQ, etc.
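The caching infrastructure named above (Redis, Memcached, etc.) is almost always used in a cache-aside pattern: check the cache, fall back to the data store on a miss, then populate the cache. The sketch below is language-agnostic in spirit and written in Python for brevity; the dict stands in for Redis, and `load_user` is a hypothetical slow data-store lookup.

```python
import time

CACHE: dict = {}        # stand-in for Redis/Memcached
TTL_SECONDS = 60.0      # entries expire after this long
CALLS = {"db": 0}       # count trips to the backing store

def load_user(user_id: str) -> dict:
    CALLS["db"] += 1    # simulate an expensive database read
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id: str, now=time.monotonic) -> dict:
    entry = CACHE.get(user_id)
    if entry is not None and now() - entry[0] < TTL_SECONDS:
        return entry[1]           # cache hit: skip the data store
    value = load_user(user_id)    # cache miss: read through
    CACHE[user_id] = (now(), value)
    return value

get_user("42")
get_user("42")
print(CALLS["db"])  # 1 -- the second call was served from the cache
```

A real deployment adds the concerns the sketch omits: eviction under memory pressure, cache stampede protection, and invalidation on writes.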
The position requires an individual to develop high-volume, low-latency applications for data analytics for large consumer-product corporations. The candidate will also contribute to all phases of the development lifecycle, write well-designed, testable, efficient code, and ensure designs follow specifications.
An ideal candidate will be/have:
• Strong experience in Python/Java.
• Familiarity with test-driven development and continuous integration.
• Strong knowledge of and hands-on experience with code development tools (Eclipse, Git, Jenkins, unit-testing frameworks).
• Familiarity with software development methodologies such as Agile.
• Ability to write complex SQL.
• Desire to learn and develop new tools and techniques and share with the team
• Knowledge of cloud would be a plus
• Ability to design software modules.
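The test-driven development and unit-testing-framework requirements above boil down to writing the failing test first, then the minimal code that makes it pass. A small `unittest` sketch follows; the `word_count` function and its tests are invented for illustration.

```python
import unittest

def word_count(text: str) -> int:
    # Function under test, written to satisfy the tests below
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # In TDD these tests are written first and fail until word_count exists
    def test_counts_whitespace_separated_words(self):
        self.assertEqual(word_count("design testable efficient code"), 4)

    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

suite = unittest.TestLoader().loadTestsFromTestCase(WordCountTest)
result = unittest.TestResult()
suite.run(result)
print(result.wasSuccessful())  # True
```

In a CI pipeline (Jenkins, GitHub Actions, etc.) this same suite runs on every commit, turning the red/green TDD loop into a merge gate.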
