Who Are We
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform built on an AI-based visual-enhancement stack, so companies can find a product suited to their needs, with deep-learning-powered technology that automatically improves their imagery.
ORBO's solutions support digital transformation in BFSI, beauty and personal care, and e-commerce image retouching in multiple ways.
- Join a top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This is a fast-paced role, with the opportunity to develop an industrial-grade solution from concept to deployment.
- Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation, and deployment
- Tune models to achieve high accuracy and minimal latency
- Deploy optimized computer vision models on edge devices to meet customer requirements
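The pipeline responsibilities above can be sketched as a minimal skeleton. This is illustrative only; the stage names and the toy threshold "model" are hypothetical stand-ins for real ingestion, training, and validation code:

```python
# Minimal sketch of an end-to-end pipeline skeleton: ingest -> prepare ->
# train -> validate. All names and the toy "model" are hypothetical.

def ingest():
    # Stand-in for reading images/labels from a data source.
    return [(0.1, 0), (0.9, 1), (0.8, 1), (0.2, 0)]

def prepare(samples):
    # Stand-in for preprocessing/augmentation.
    return [(x, y) for x, y in samples]

def train(samples):
    # Toy "model": a threshold derived from the training data.
    return sum(x for x, _ in samples) / len(samples)

def validate(model, samples):
    # Accuracy of the thresholded predictions.
    correct = sum((x > model) == bool(y) for x, y in samples)
    return correct / len(samples)

data = prepare(ingest())
model = train(data)
accuracy = validate(model, data)
```

In a real system each stage would wrap a data loader, a training framework, and a held-out validation set, but the overall shape stays the same.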
- Bachelor’s degree
- Deep and broad understanding of computer vision and deep learning algorithms
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in image enhancement, object detection, image segmentation, and image classification algorithms
- Experience in deployment with OpenVINO, ONNXruntime and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks such as TensorFlow and PyTorch
- Proficient understanding of code versioning tools, such as Git
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
Numadic is hiring a Data engineer
We are Numads
Drawn to the unknown, we are new age nomads who seek to bring near what is far. We work as full stack humans, able to operate independently while enjoying the journey together. We see past the sandlines of clan and craft and value the unique and special talents of each. We think, we design, we code, we write, we share, we care and we ride together. We aim to live by our values of Humility, Collaboration and Transformation.
We undisrupt vehicle payments
To impact a highly fragmented v-commerce space, we aim to bring order by simplifying and aggregating. We are a full-stack v-commerce platform. We build the network side of our products to achieve dense on-ground digital coverage by working with and aggregating different types of partners, and to help set the standards for scaling sustainably. We also build the user side of our products to make the road travel experience for our vehicle owners and drivers contactless and fully autonomous.
How you'll make an impact
- Apply advanced predictive modeling and statistical techniques to design, build, maintain, and improve upon multiple real-time decision systems.
- Visualize and show complex data-sets via multidimensional visualization tools.
- Perform data cleansing, transformation & feature engineering.
- Design scalable automated data mining, modelling and validation processes.
- Produce scalable, reusable, efficient feature code to be implemented on clusters and standalone data servers.
- Contribute to the development/ deployment of machine learning algorithms, operational research, semantic analysis, and statistical methods for finding structure in large data sets.
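The cleansing, transformation, and feature-engineering duties above can be sketched with a minimal stdlib-only example. The field names ("amount", "distance_km") and cleansing rules are hypothetical:

```python
# Sketch of data cleansing and one derived feature, stdlib only.
# Field names and rules are hypothetical illustrations.

records = [
    {"amount": "120.5", "distance_km": "10"},
    {"amount": "", "distance_km": "5"},    # missing amount -> dropped
    {"amount": "80", "distance_km": "0"},  # zero distance -> dropped
]

def cleanse(rows):
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
            distance = float(row["distance_km"])
        except ValueError:
            continue  # drop rows with unparseable values
        if distance <= 0:
            continue  # drop rows that would divide by zero
        # Derived feature: cost per kilometre.
        clean.append({"amount": amount, "distance_km": distance,
                      "cost_per_km": amount / distance})
    return clean

features = cleanse(records)
```

Real pipelines would express the same filter/derive steps in Spark or pandas, but the validation logic (reject, then derive) is the core idea.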
- Knowledge & working proficiency in Python, SQL
- Deep understanding about data structures and algorithms.
- Bias for action - Ability to move quickly while taking time out to review the details.
- Clear communicator - Ability to synthesise and clearly articulate complex information, highlighting key takeaways and actionable insights.
- Team player - Working mostly autonomously, yet being a team player keeping your crews looped-in.
- Mindset - Ability to take responsibility for your life and that of your people and projects.
- Mindfulness - Ability to maintain practices that keep you grounded.
Position: Data Scientist
Experience: 5+ Years
- Design thinking to really understand the business problem
- Understanding new ways to deliver (agile, DT)
- Being able to do a functional design across S/4HANA and SCP. An understanding of the possibilities around automation/RPA (including UiPath, Blue Prism, and Contextor) and how these can be identified and embedded in business processes
- Following on from this, the same is true for AI and ML: what is available in the SAP standard, how it can be enhanced or developed further, and how these technologies can be embedded in the business process. It is not enough to understand only the standard process or only the AI and ML components; we will need a new type of hybrid SAP practitioner.
At least 1 year of Python, Spark, SQL, and data engineering experience
Primary Skillset: PySpark, Scala/Python/Spark, Azure Synapse, S3, RedShift/Snowflake
Relevant Experience: Legacy ETL job Migration to AWS Glue / Python & Spark combination
Reverse engineer the existing/legacy ETL jobs
Create the workflow diagrams and review the logic diagrams with Tech Leads
Write equivalent logic in Python & Spark
Unit test the Glue jobs and certify the data loads before passing to system testing
Follow the best practices, enable appropriate audit & control mechanism
Be analytically skillful; identify root causes quickly and debug issues efficiently
Take ownership of the deliverables and support the deployments
Create data pipelines for data integration into Cloud stacks eg. Azure Synapse
Code data processing jobs in Azure Synapse Analytics, Python, and Spark
Experience in dealing with structured, semi-structured, and unstructured data in batch and real-time environments.
Should be able to process .json, .parquet and .avro files
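The migration duties above (reverse-engineer the legacy logic, rewrite it, certify the loads) can be sketched as a minimal parity check. The transforms here are hypothetical stand-ins; real jobs would run as Glue/Spark jobs over actual loads:

```python
# Minimal sketch of certifying a migrated ETL job: run legacy and new
# logic over the same input and check the outputs match. The transforms
# are hypothetical stand-ins for real Glue/Spark jobs.

def legacy_transform(rows):
    out = []
    for r in rows:
        if r["qty"] > 0:
            out.append({"id": r["id"], "total": r["qty"] * r["price"]})
    return out

def migrated_transform(rows):
    # Rewritten equivalent logic (here, Python & a comprehension).
    return [{"id": r["id"], "total": r["qty"] * r["price"]}
            for r in rows if r["qty"] > 0]

def certify(rows):
    # Row-count and content parity between legacy and migrated outputs.
    old, new = legacy_transform(rows), migrated_transform(rows)
    return len(old) == len(new) and old == new

sample = [{"id": 1, "qty": 2, "price": 5.0},
          {"id": 2, "qty": 0, "price": 3.0}]
ok = certify(sample)
```

In practice the certification step would also compare aggregates (sums, distinct counts) between the legacy warehouse and the new Glue output before handing off to system testing.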
Tier1/2 candidates from IIT/NIT/IIITs
However, relevant experience and a learning attitude take precedence
A GCP Data Analyst profile must have the below skill sets:
- Knowledge of programming languages such as SQL, Oracle, R, MATLAB, Java, and Python
- Data cleansing, data visualization, data wrangling
- Data modeling , data warehouse concepts
- Adaptability to big data platforms such as Hadoop and Spark for stream and batch processing
- GCP (Cloud Dataproc, Cloud Dataflow, Cloud Datalab, Cloud Dataprep, BigQuery, Cloud Datastore, Cloud Datafusion, Auto ML etc)
Required Skills: GSM, Linux, Networking, infrastructure, VOIP, PHP, Perl, SIP, troubleshooting, Python, HTML
- 5+ years of solid experience as an Asterisk Developer
- Must be proficient at developing software using a scripting language. A distinct advantage if you have experience using Python
- Sound knowledge of Asterisk Installation, Configuration, Dialplan, Call troubleshooting (SIP, ISDN)
- Solid experience of working With Linux Operating Systems
- Good understanding of VoIP, SIP, SS7.
- Banking Domain
- Assist the team in building Machine learning/AI/Analytics models on open-source stack using Python and the Azure cloud stack.
- Be part of the internal data science team at fragma data - that provides data science consultation to large organizations such as Banks, e-commerce Cos, Social Media companies etc on their scalable AI/ML needs on the cloud and help build POCs, and develop Production ready solutions.
- Candidates will be provided with opportunities for training and professional certifications on the job in these areas - Azure Machine learning services, Microsoft Customer Insights, Spark, Chatbots, DataBricks, NoSQL databases etc.
- Assist the team in conducting AI demos, talks, and workshops occasionally to large audiences of senior stakeholders in the industry.
- Work on large enterprise scale projects end-to-end, involving domain specific projects across banking, finance, ecommerce, social media etc.
- Keen interest to learn new technologies and latest developments and apply them to projects assigned.
- Professional hands-on coding experience in Python for over 1 year for Data Scientist, and over 3 years for Sr. Data Scientist.
- This is primarily a programming/development-oriented role, so strong programming skills in writing object-oriented and modular code in Python and experience pushing projects to production are important.
- Strong foundational knowledge and professional experience in
- Machine learning, (Compulsory)
- Deep Learning (Compulsory)
- Strong knowledge of at least one of: Natural Language Processing, Computer Vision, Speech Processing, or Business Analytics
- Understanding of Database technologies and SQL. (Compulsory)
- Knowledge of the following Frameworks:
- Scikit-learn (Compulsory)
- Keras/TensorFlow/PyTorch (at least one of these is compulsory)
- API development in Python for ML models (good to have)
- Excellent communication skills are necessary to succeed in this role, as it has high external visibility, with multiple opportunities to present data science results to large external audiences including VPs, Directors, and CXOs. Communication skills will therefore be a key consideration in the selection process.
The candidate must have expertise in ADF (Azure Data Factory) and be well versed in Python.
Performance optimization of scripts (code) and Productionizing of code (SQL, Pandas, Python or PySpark, etc.)
Bachelor's degree in Computer Science, Data Science, Computer Engineering, IT, or equivalent
Fluency in Python (Pandas), PySpark, SQL, or similar
Azure data factory experience (min 12 months)
Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process.
Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
Ability to work independently with demonstrated experience in project or program management
Azure experience; ability to translate data scientists' Python code into efficient production code for cloud deployment
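The "make data scientist code efficient for production" duty often comes down to removing accidental quadratic work. A minimal sketch, with hypothetical data, replacing a notebook-style rescan with a single-pass aggregation:

```python
# Sketch of a common productionization step: replacing a quadratic
# lookup loop with a single-pass aggregation. Data is hypothetical.

orders = [{"cust": "a", "amt": 10}, {"cust": "b", "amt": 20},
          {"cust": "a", "amt": 5}]

def totals_slow(rows):
    # Notebook-style: rescans the whole list for every row, O(n^2).
    return {r["cust"]: sum(x["amt"] for x in rows if x["cust"] == r["cust"])
            for r in rows}

def totals_fast(rows):
    # Production-style: one pass with an accumulator dict, O(n).
    totals = {}
    for r in rows:
        totals[r["cust"]] = totals.get(r["cust"], 0) + r["amt"]
    return totals

# Equivalence check before swapping the implementation in production.
assert totals_slow(orders) == totals_fast(orders)
result = totals_fast(orders)
```

The same pattern (verify output parity, then swap in the efficient version) applies when porting Pandas or PySpark code to a cloud deployment.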
Amazon Web Services (AWS) is carrying on that tradition while leading the world in Cloud technologies. As a member of the AWS Professional Services team you will be at the forefront of this transformational technology assisting a global list of companies that are taking advantage of a growing set of services and features to run their mission-critical applications.
Professional Services engage in a wide variety of projects for customers and partners, providing collective experience from across the AWS customer base and are obsessed about strong success for the Customer. Our team collaborates across the entire AWS organization to bring access to product and service teams, to get the right solution delivered and drive feature innovation based upon customer needs. You will collaborate with our customers and/or partners on key engagements and will develop and deliver proof-of-concept projects, technical workshops, feature comparisons, and execute migration projects.
You will be based in Hyderabad and might have to travel globally to ensure customer success.
- Employ customer facing skills to represent AWS well within the customer's environment and drive discussions with technical and business teams.
- As a key member of the team, ensure success in designing, building and migrating applications, software, and services on the AWS platform
- Participate in architectural discussions and design exercises to create large scale solutions built on AWS and also be part of the development lifecycle.
- Identify workarounds for specific issues and corner scenarios observed during migration
- Automate solutions for repeatable problems
- Develop test plans and test cases to demonstrate application/database readiness post migration
- Work closely with application teams to ensure business functionality and SLAs are met
- Consult for optimal design of database environments, analyzing complex distributed production deployments, and making recommendations to optimize performance
- Develop innovative solutions to complex business and technology problems
- Educate customers on the value proposition of AWS and AWS services
- Partner with the sales team to design solutions for customers that drive AWS adoption and revenue
- Conduct technical sessions for internal teams, partners and customers
BASIC QUALIFICATIONS :
- 12+ years of experience in a technical position.
- 4+ years on any Cloud Platform (AWS, Azure, Google etc).
- Bachelor's degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field.
- Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations.
- Strong programming skills in Java and/or Python.
- Strong hands-on experience in integrating multiple databases like Oracle, SQL Server, PostgreSQL etc.
- Deep hands-on experience in the design, development and deployment of business software at scale.
- Customer facing skills to represent AWS well within the customer's environment and drive discussions with senior personnel regarding trade-offs, best practices, project management and risk mitigation
- Experience leading or contributing to highly available and fault-tolerant enterprise and web-scale software applications.
- Experience in performance optimization techniques.
- Strong troubleshooting and communication skills.
- Proven experience with software development life cycle (SDLC) and agile/iterative methodologies required
PREFERRED QUALIFICATIONS :
- Implementation experience with primary AWS services (EC2, ELB, RDS, Lambda, API Gateway, Route 53, and S3).
- AWS Solutions Architect Certified
- Infrastructure automation through DevOps scripting (e.g., shell, Python, Ruby, PowerShell, Perl)
- Configuration management using CloudFormation and/or Chef/Puppet
- Experience in database programming like PL/SQL etc.
- Demonstrated ability to think strategically about business, product, and technical challenges
- Integration of AWS cloud services with on-premise technologies from Microsoft, IBM, Oracle, HP, SAP etc.
- Experience with IT compliance and risk management requirements (e.g., security, privacy, SOX, HIPAA).
- Extended travel to customer locations may be required to sell and deliver professional services as needed
Amazon is an equal opportunity employer. Amazon or its Recruitment Partners do not charge any fee or security deposit from the candidate for offering employment.
Should have excellent problem solving and programming skills in Python/Java.
Strong interpersonal, communication and analytical skills
Should have the ability to express their design ideas and thoughts.
Should have the zeal and adaptability to learn new technology frameworks.
Should have graduated in 2020, or be graduating in 2020, and have consistently scored above 75% or a CGPA of 8.