About Sagacito Technologies
Principal Accountabilities:
1. Strong communication skills, with the ability to convert business requirements into functional requirements
2. Develop data-driven insights and machine learning models to identify and extract facts from sales, supply chain and operational data
3. Sound knowledge of and experience in statistical and data mining techniques: regression, random forests, boosting trees, time series forecasting, etc.
4. Experience in state-of-the-art (SOTA) deep learning techniques for solving NLP problems
5. End-to-end data collection, model development and testing, and integration into production environments
6. Build and prototype analysis pipelines iteratively to provide insights at scale
7. Experience in querying different data sources
8. Partner with developers and business teams on business-oriented decisions
9. Willingness to move forward even when the path is not clear, and creativity in overcoming challenges in the data
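As context for the modelling techniques named above (regression, random forests, etc.), they follow a common fit/predict workflow. A minimal illustrative sketch with scikit-learn follows; the "sales" data is synthetic and the feature names are hypothetical, purely to show the shape of the work:

```python
# Minimal sketch of the fit/predict workflow behind techniques such as
# random-forest regression. The "sales" data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # e.g. price, discount, ad spend (hypothetical features)
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=200)  # synthetic sales signal

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
preds = model.predict(X_test)
print(round(model.score(X_test, y_test), 2))  # R^2 on held-out data
```

The same pattern (split, fit, evaluate on held-out data) carries over to boosting trees and time series models, with the evaluation metric swapped to suit the task.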
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional and non-functional requirements.
● Build and optimize 'big data' pipelines, architectures and data sets.
● Maintain, organize and automate data processes for various use cases.
● Identify trends, perform follow-up analysis and prepare visualizations.
● Create daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights relevant business trends and opportunities for improvement.
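The daily/weekly/monthly KPI reporting described above typically reduces to time-based aggregation. One possible sketch with pandas, using made-up event data (column names and figures are illustrative, not from any real pipeline):

```python
# One way to produce daily/weekly/monthly KPI roll-ups with pandas.
# The events below are synthetic; real data would come from the pipeline.
import pandas as pd

events = pd.DataFrame(
    {"ts": pd.date_range("2024-01-01", periods=60, freq="D"),
     "revenue": range(60)}
).set_index("ts")

daily = events["revenue"].resample("D").sum()    # one row per day
weekly = events["revenue"].resample("W").sum()   # calendar weeks
monthly = events["revenue"].resample("MS").sum() # one row per month start
print(monthly.head())
```

In practice the same resampled frames would feed a visualization or a scheduled report rather than a print statement.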
Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred) and Linux is a must.
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment, and use of
command-line tools including knowledge of shell/Python scripting for
automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
Who Are We
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence at its core: a comprehensive platform built on an AI-based visual enhancement stack. Companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.
ORBO's solutions support digital transformation in the BFSI and beauty and personal care industries, and image retouching in e-commerce, in multiple ways.
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.
- Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation and deployment
- Tune models to achieve high accuracy and minimal latency
- Deploy the developed computer vision models on edge devices, after optimization, to meet customer requirements
- Bachelor’s degree
- Deep and broad understanding of computer vision and deep learning algorithms.
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in Image enhancement, object detection, image segmentation, image classification algorithms
- Experience in deployment with OpenVINO, ONNX Runtime and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks such as TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git
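Regardless of which inference runtime is used (ONNX Runtime, TensorRT, OpenVINO), edge deployment usually involves converting camera frames into the layout the exported model expects. A minimal sketch of that preprocessing step with NumPy follows; the mean/std constants are the common ImageNet values, which is an assumption here — the real model's training recipe dictates them:

```python
# Sketch of typical input preprocessing before running a vision model with an
# inference runtime such as ONNX Runtime or TensorRT: HWC uint8 image ->
# normalized NCHW float32 tensor. Mean/std are the usual ImageNet values
# (an assumption; use whatever the deployed model was trained with).
import numpy as np

def preprocess(img: np.ndarray) -> np.ndarray:
    """img: (H, W, 3) uint8 RGB -> (1, 3, H, W) float32, normalized."""
    x = img.astype(np.float32) / 255.0
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    x = (x - mean) / std
    x = x.transpose(2, 0, 1)[np.newaxis]  # HWC -> NCHW, add batch dimension
    return np.ascontiguousarray(x)        # runtimes expect contiguous buffers

frame = np.zeros((224, 224, 3), dtype=np.uint8)  # placeholder camera frame
print(preprocess(frame).shape)  # (1, 3, 224, 224)
```

The resulting tensor would then be handed to the runtime's inference call; getting this layout and normalization wrong is one of the most common sources of accuracy loss after deployment.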
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
We have an excellent job opportunity for an "Applied Machine Learning Engineer" with a product-based organization, available in remote working mode or at the Mumbai location.
- Apply your knowledge of ML and statistics to conceptualise, experiment, develop & deploy machine learning & deep learning systems.
- Understanding the business objectives & defining the right target metrics to track performance & progress.
- Defining & building datasets with the appropriate representation techniques for learning.
- Training & tuning models. Running evaluation & test experiments on the models.
- Build ML pipelines end to end. (Everything MLOps.)
- Building pipelines for the various stages.
- Deploying models.
- Troubleshooting issues with models in production.
- Reporting results of model performance in production.
- Retraining, performance logging & maintenance.
- Help the business with insights for better decision-making. You will build many predictive models for internal business operations and derive insights from the trained models and data to help the product and business teams make better decisions.
- 2+ years of work experience as an ML engineer or Data Scientist with a Bachelors Degree in Computer science or related field
- Theoretical and practical knowledge of machine learning, deep learning and statistical methods (NLP tasks, recommender systems, predictive modelling, etc.)
- Since Pepper is a content company, you will work on many interesting text based problems. Solid understanding of Natural Language Processing techniques with Deep Learning is a must for this role.
- Familiarity with popular NLP applications and text representation architectures and techniques: text classification, machine translation, named entity recognition, summarisation, question answering, zero-shot learning, etc.; Bag of Words, TF-IDF, Word2vec, GloVe, BERT, ELMo, GPT, etc.
- Experience with ML frameworks (like Tensorflow, Keras, PyTorch) & libraries like Sklearn.
- Experience with ML infrastructure & shipping models.
- Excellent programming & algorithmic skills. Good understanding of Data Structures and algorithms (fluent in at least one object oriented programming language). Proficiency in Python is a must.
- Strong understanding of database systems & schema design. Proficient in SQL
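As a small concrete instance of the text-classification and TF-IDF techniques listed in the requirements, here is a toy sketch using scikit-learn; the two-class corpus is made up purely to show the pipeline shape, not any real workload:

```python
# Toy sketch of the TF-IDF + classifier pattern for text classification.
# The tiny two-class corpus below is invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["great product, loved it", "awful service, very bad",
         "excellent quality", "terrible, do not buy"]
labels = ["pos", "neg", "pos", "neg"]

# TF-IDF turns each document into a sparse weighted term vector;
# logistic regression then learns a linear decision boundary over it.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["really great quality"]))
```

Deep-learning approaches (BERT-style encoders, etc.) replace the TF-IDF vectors with learned contextual embeddings, but the surrounding fit/predict structure stays much the same.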
If you are interested in the above opening, please share the following details:
Current CTC :
Expected CTC :
Notice Period :
Relevant experience in Machine Learning :
Relevant experience in Deep Learning:
Relevant experience in NLP Applications:
Must have experience on e-commerce projects
We are looking for
A Senior Software Development Engineer (SDE2) who will be instrumental in the design and development of our backend technology, which manages our exhaustive data pipelines and AI models. Simplifying complexity and building technology that is robust and scalable is your North Star. You'll work closely alongside our CTO and machine learning engineers, frontend and wider technical team to build new capabilities, focused on speed and reliability.
You'll own your work, to build, test and iterate quickly, with direct guidance from our CTO.
Please note: You must have proven industry experience greater than 2 years.
Your work includes
- Own and manage the entire engineering infrastructure that supports the Greendeck platform.
- Create highly scalable, robust and available Python microservices.
- Design the architecture to stream data on a huge scale across multiple services.
- Create and manage data pipelines using tools like Kafka, Celery.
- Deploy Serverless functions to process and manage data.
- Work with variety of databases and storage systems to store and strategically manage data.
- Write connections to collect data from various third party services, data storages and APIs.
- Strong experience creating scripts, apps or services in Python
- Strong automation and scripting skills
- Knowledge of at least one SQL and one NoSQL database
- Experience working with messaging systems like Kafka and RabbitMQ
- Good knowledge of dataframes and data manipulation
- Have built and deployed apps using FastAPI, Flask or similar tech
- Knowledge of the CI/CD paradigm
- Basic knowledge of Docker
- Knowledge of creating and using REST APIs
- Good knowledge of OOP fundamentals
- (Optional) Knowledge about Celery/ Airflow
- (Optional) Knowledge about Lambda/ Serverless
- (Optional) Have connected apps using OAuth
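The data-pipeline work described above (Kafka topics, Celery workers) is, at its heart, the producer/consumer pattern. A pure-stdlib sketch of that pattern follows; it is an illustration of the idea only — Kafka and Celery provide the durable, distributed, at-scale version of what the in-process queue does here:

```python
# Pure-stdlib sketch of the producer/consumer pattern that messaging systems
# like Kafka and task queues like Celery implement at scale: one thread
# produces events, a worker thread consumes and transforms them.
import queue
import threading

events: queue.Queue = queue.Queue()
results = []

def producer(n: int) -> None:
    for i in range(n):
        events.put({"id": i, "price": 100 + i})
    events.put(None)  # sentinel: no more events

def worker() -> None:
    while True:
        msg = events.get()
        if msg is None:
            break
        # "processing" step: e.g. enrich or validate the event
        results.append({**msg, "price_with_tax": round(msg["price"] * 1.2, 2)})

t_prod = threading.Thread(target=producer, args=(3,))
t_work = threading.Thread(target=worker)
t_prod.start(); t_work.start()
t_prod.join(); t_work.join()
print(results)
```

Swapping the in-memory queue for a Kafka topic (and the worker for a Celery task) changes the transport, not the shape of the code: producers publish messages, consumers process them independently.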
What you can expect
- Attractive pay, bonus scheme and flexible vacation policy.
- A truly flexible, trust-based, performance driven work culture.
- Lunch is on us, everyday!
- A young and passionate team building elegant products with intricate technology for the future of businesses around the world. Our average age is 25!
- The chance to make a huge difference to the success of a world-class SaaS product and the opportunity to make an impact.
It's important to us
- That you relocate to Indore
- That you have a minimum of 2 years of experience working as a Software Developer
As a Senior Engineer - Big Data Analytics, you will help with the architectural design and development of Healthcare Platforms, Products, Services, and Tools to deliver the vision of the Company. You will contribute significantly to engineering, technology, and platform architecture, through innovation and collaboration with engineering teams and related business functions. This is a critical, highly visible role within the company that has the potential to drive significant business impact.
The scope of this role will include strong technical contribution in the development and delivery of Big Data Analytics Cloud Platform, Products and Services in collaboration with execution and strategic partners.
- Design & develop, operate, and drive scalable, resilient, and cloud native Big Data Analytics platform to address the business requirements
- Help drive technology transformation to achieve business transformation, through the creation of the Healthcare Analytics Data Cloud that will help Change establish a leadership position in healthcare data & analytics in the industry
- Help in successful implementation of Analytics as a Service
- Ensure Platforms and Services meet SLA requirements
- Be a significant contributor and partner in the development and execution of the Enterprise Technology Strategy
- At least 2 years of experience in software development for big data analytics and cloud, and at least 5 years of experience in software development overall
- Experience working with High Performance Distributed Computing Systems in public and private cloud environments
- Understanding of big data open-source ecosystems and their players; contribution to open source is a strong plus
- Experience with Spark, Spark Streaming, Hadoop, AWS/Azure, NoSQL Databases, In-Memory caches, distributed computing, Kafka, OLAP stores, etc.
- A successful track record of creating working big data stacks that align with business needs and delivering enterprise-class products on time
- Experience delivering and managing operating environments at scale
- Experience with Big Data/Micro Service based Systems, SaaS, PaaS, and Architectures
- Experience developing systems in Java, Python and Unix
- BSCS, BSEE or equivalent, MSCS preferred
About the Company:
This opportunity is for an AI Drone Technology startup funded by the Indian Army. It is working to develop cutting-edge products to help the Indian Army gain an edge in New Age Enemy Warfare.
They are working on using drones to neutralize terrorists hidden in deep forests. Get a chance to contribute to securing our borders against the enemy.
- Extensive knowledge in machine learning and deep learning techniques
- Solid background in image processing/computer vision
- Experience in building datasets for computer vision tasks
- Experience working with and creating data structures/architectures
- Proficiency in at least one major machine learning framework such as Tensorflow, Pytorch
- Experience visualizing data to stakeholders
- Ability to analyze and debug complex algorithms
- Highly skilled in Python scripting language
- Creativity and curiosity for solving highly complex problems
- Excellent communication and collaboration skills
MS in Engineering, Applied Mathematics, Data Science, Computer Science or an equivalent field with 3 years of industry experience; or a PhD degree; or equivalent industry experience.