50+ Data Science Jobs in India
About the company
DCB Bank is a new generation private sector bank with 442 branches across India. It is a scheduled commercial bank regulated by the Reserve Bank of India. DCB Bank’s business segments are Retail Banking, Micro SME, SME, Mid-Corporate, Agriculture, Government, Public Sector, Indian Banks, Co-operative Banks, and Non-Banking Finance Companies.
Job Description
Department: Risk Analytics
CTC: Max 18 Lacs
Grade: Sr Manager/AVP
Experience: Min 4 years of relevant experience
We are looking for a Data Scientist to join our growing team of Data Science experts and manage the processes and people responsible for accurate data collection, processing, modelling, analysis, implementation, and maintenance.
Responsibilities
- Understand, monitor, and maintain existing financial scorecards (ML-based) and make changes to the model when required.
- Perform Statistical analysis in R and assist IT team with deployment of ML model and analytical frameworks in Python.
- Should be able to handle multiple tasks and must know how to prioritize the work.
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.
- Develop clear, concise, and actionable solutions and recommendations for the client’s business needs; actively explore the client’s business and formulate ideas that can help the client cut costs efficiently or achieve growth/revenue/profitability targets faster.
- Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions.
- Design and build technical processes to address business issues.
- Oversee the design and delivery of reports and insights that analyse business functions and key operations and performance metrics.
- Manage and optimize processes for data intake, validation, mining, and engineering as well as modelling, visualization, and communication deliverables.
- Communicate results and business impacts of insight initiatives to the Management of the company.
Requirements
- Industry knowledge
- 4 years or more of experience in the financial services industry, particularly the retail credit industry, is a must.
- The candidate should have worked either in the banking sector (banks/HFCs/NBFCs) or in consulting organizations serving these clients.
- Experience in credit risk model building such as application scorecards, behaviour scorecards, and/ or collection scorecards.
- Experience in portfolio monitoring, model monitoring, model calibration
- Knowledge of ECL/ Basel preferred.
- Educational qualification: Advanced degree in finance, mathematics, econometrics, or engineering.
- Technical knowledge: Strong data handling skills in databases such as SQL and Hadoop. Knowledge of data visualization tools such as SAS VI/Tableau/Power BI is preferred.
- Expertise in either R or Python; SAS knowledge will be a plus.
Soft skills:
- Ability to quickly adapt to the analytical tools and development approaches used within DCB Bank
- Ability to multi-task; good communication and teamworking skills.
- Ability to manage day-to-day written and verbal communication with relevant stakeholders.
- Ability to think strategically and make changes to data when required.
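The scorecard work described above can be sketched in miniature. The snippet below fits a tiny logistic regression by plain gradient descent on made-up applicant features and maps the predicted bad-rate onto a points scale using the common "points to double the odds" convention. All feature names, data values, and parameters here are illustrative assumptions, not DCB Bank's actual model.

```python
import math

# Hypothetical applicant features: (utilisation, delinquencies, income_ratio)
# with a 1/0 "bad" flag -- purely illustrative data.
data = [
    ((0.9, 3, 0.6), 1),
    ((0.8, 2, 0.5), 1),
    ((0.2, 0, 0.1), 0),
    ((0.3, 0, 0.2), 0),
    ((0.7, 1, 0.4), 1),
    ((0.1, 0, 0.1), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit logistic regression with stochastic gradient descent.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(500):
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def score(x, pdo=20.0, base=600.0):
    """Map predicted bad-probability to a scorecard-style points scale."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    # guard both ends so log() never sees zero
    odds = max(1.0 - p, 1e-9) / max(p, 1e-9)
    return base + pdo / math.log(2) * math.log(odds)

good = score((0.1, 0, 0.1))  # low-risk applicant should score higher
bad = score((0.9, 3, 0.6))
print(good > bad)
```

In a real scorecard build, the coefficients would come from a regularised fit on binned (WoE-transformed) variables, but the probability-to-points mapping works the same way.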
AI Product Manager
About Us:
At Formi, we’re not just riding the wave of innovation—we’re the ones creating it. Our AI-powered solutions are revolutionizing how businesses operate, pushing boundaries, and opening up new realms of possibility. We’re building the future of intelligent operations, and whether you join us or not, we’re going to change the game. But with you on board? We’ll do it faster.
About the Role:
We’re looking for a Product Manager who’s not afraid to take risks, who questions everything, and who can transform bold, crazy ideas into products that reshape entire industries. You won’t just manage; you’ll build, you’ll innovate, and you’ll lead. The best ideas? They come from people who think differently. If that’s you, let’s get started.
Your Mission (If You’re Ready for It):
We’re not hiring you for the usual. This is not a role for anyone looking to play it safe. We need someone who’s ready to challenge the status quo and make radical changes. You’ll be building the future with us, so expect to be pushed—and to push back. We’re not here to settle for less.
Key Responsibilities:
- Vision and Strategy: Forget business as usual. Your job is to craft strategies that disrupt markets and create competitive advantages that others can only dream of.
- Cross-Functional Leadership: You’ll lead from the front, working closely with engineering, sales, and marketing to turn ideas into reality. From concept to launch, you’re the one guiding the ship.
- Customer-Centric Innovation: Don’t just listen to customers—understand them at a deeper level. Then use that understanding to create products that make people say, “Why didn’t anyone think of this sooner?”
- End-to-End Ownership: You own it all. From the first spark of an idea to delivering a product that changes the game.
- Data-Driven Decision Making: We don’t guess here. Use data, trends, and insights to make decisions that propel our products forward. Let the numbers tell the story.
- Stakeholder Communication: Communicate like a visionary. Show the people around you the path forward and how it all ties together.
- Continuous Innovation: Complacency is the enemy. Keep pushing the envelope, iterating, and improving so we stay miles ahead of the competition.
Who Should Apply?
- You’re not just a thinker, you’re a doer. You thrive in fast-paced, innovative environments where the status quo is meant to be broken.
- You’ve got 2+ years of experience in product management, preferably in tech or SaaS, and you’ve led teams to deliver products that actually make an impact.
- You’re data-obsessed. Your decisions are backed by analytics, and you know how to measure success.
- You’re not just a communicator; you’re an inspirer. You can articulate a vision so clearly, people can’t help but rally behind it.
This Role is NOT for You If:
- You’re just looking for another job.
- You think “good enough” is, well, good enough.
- You’d rather play it safe than take bold, calculated risks.
Why Join Us?
- Impact: You’ll be part of a company that’s literally changing how businesses operate. Your work will have a direct impact on the future of creating a surplus economy.
- Growth: We’re scaling fast, and with that comes insane opportunities for personal and professional development.
- Culture: We move fast, we break things (intelligently), and we value big ideas. We’re building something massive, and you’ll be a key part of it.
Are you ready to disrupt entire industries and make a dent in the universe?
Apply now and let’s build the future together.
We are seeking a highly skilled and experienced Senior Data Analyst to join our team. In this role, you will be responsible for analyzing complex datasets, generating actionable insights, and supporting strategic decision-making processes. You will collaborate with cross-functional teams, develop data models, and create compelling reports and visualizations to communicate findings effectively.
Key Responsibilities:
- Data Analysis: Perform in-depth analysis of large datasets to identify trends, patterns, and correlations. Utilize statistical methods and data mining techniques to extract actionable insights.
- Reporting & Visualization: Develop and deliver comprehensive reports and interactive dashboards that effectively communicate data insights and business performance to stakeholders.
- Data Modeling: Build and maintain data models that support business operations and decision-making processes. Conduct data validation and ensure accuracy and consistency.
- Business Intelligence: Collaborate with business leaders and teams to understand their data needs and translate them into analytical requirements. Provide data-driven recommendations to support strategic initiatives.
The client is based in Bangalore.
Data Science:
• Expert-level Python, strong analytical skills, experience with different types of models, solid grasp of basic concepts, CPG domain knowledge.
• Statistical models and hypothesis testing
• Machine learning (important)
• Business understanding; visualization in Python
• Classification, clustering, and regression
Mandatory Skills
• Data Science, Python, Machine Learning, Statistical Models, Classification, Clustering, and Regression
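As a small illustration of the clustering skill listed above, here is a from-scratch one-dimensional k-means on toy data; the data points and parameters are invented for illustration only.

```python
import random

random.seed(0)  # deterministic centroid initialisation

# Two obviously separated 1-D clusters -- illustrative data.
points = [1.0, 1.2, 0.8, 1.1, 9.0, 9.3, 8.7, 9.1]

def kmeans_1d(points, k=2, iters=20):
    # initialise centroids from the data itself
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # recompute centroids as cluster means (keep old one if empty)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

c1, c2 = kmeans_1d(points)
print(round(c1, 3), round(c2, 3))  # converges to the two cluster means
```

On this well-separated data the algorithm settles on the means of the two groups (1.025 and 9.025) regardless of which points are sampled as initial centroids.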
About the Role
We are actively seeking talented Senior Python Developers to join our ambitious team dedicated to pushing the frontiers of AI technology. This opportunity is tailored for professionals who thrive on developing innovative solutions and who aspire to be at the forefront of AI advancements. You will work with different companies in the US who are looking to develop both commercial and research AI solutions.
Required Skills:
- Write effective Python code to tackle complex issues
- Use business sense and analytical abilities to glean valuable insights from public databases
- Clearly express the reasoning and logic when writing code in Jupyter notebooks or other suitable mediums
- Extensive experience working with Python
- Proficiency with the language's syntax and conventions
- Previous experience tackling algorithmic problems
- Nice to have some prior Software Quality Assurance and Test Planning experience
- Excellent spoken and written English communication skills
The ideal candidate should be able to:
- Clearly explain their strategies for problem-solving.
- Design practical solutions in code.
- Develop test cases to validate their solutions.
- Debug and refine their solutions for improvement.
- Engage with client business team managers and leaders independently to understand their requirements, help them structure their needs into data needs, prepare functional and technical specifications for execution, and ensure delivery from the data team. This can be a combination of ETL processes, reporting tools, and analytics tools such as SAS, R, and the like.
- Lead and manage the Business Analytics team, ensuring effective execution of projects and initiatives.
- Develop and implement analytics strategies to support business objectives and drive data-driven decision-making.
- Analyze complex data sets to provide actionable insights that improve business performance.
- Collaborate with other departments to identify opportunities for process improvements and implement data-driven solutions.
- Oversee the development, maintenance, and enhancement of dashboards, reports, and analytical tools.
- Stay updated with the latest industry trends and technologies in analytics and data science.
About the Company :
Nextgen Ai Technologies is at the forefront of innovation in artificial intelligence, specializing in developing cutting-edge AI solutions that transform industries. We are committed to pushing the boundaries of AI technology to solve complex challenges and drive business success.
Currently offering "Data Science Internship" for 2 months.
Data Science projects on which interns will work:
Project 01 : Image Caption Generator Project in Python
Project 02 : Credit Card Fraud Detection Project
Project 03 : Movie Recommendation System
Project 04 : Customer Segmentation
Project 05 : Brain Tumor Detection with Data Science
Eligibility
A PC or Laptop with decent internet speed.
Good understanding of the English language.
Any graduate with a desire to become a web developer. Freshers are welcome.
Knowledge of HTML, CSS and JavaScript is a plus but NOT mandatory.
Freshers are welcome. You will also receive proper training, so don't hesitate to apply even if you don't have a coding background.
Please note that THIS IS AN INTERNSHIP, NOT A JOB.
We recruit permanent employees only from among our interns (when needed).
Duration : 02 Months
MODE: Work From Home (Online)
Responsibilities
Manage reports and sales leads in salesforce.com, CRM.
Develop content, manage design, and user access to SharePoint sites for customers and employees.
Build data-driven reports, stored procedures, and query optimizations using SQL and PL/SQL.
Learn the essentials of C++ and Java to refine code and build the exterior layer of web pages.
Configure and load XML data for the BVT tests.
Set up a GitHub page.
Develop spark scripts by using Scala shell as per requirements.
Develop and A/B test improvements to business survey questions on iOS.
Deploy statistical models to various company data streams using Linux shells.
Create monthly performance-based client billing reports using MySQL and NoSQL databases.
Utilize Hadoop and MapReduce to generate dynamic queries and extract data from HDFS.
Create source code utilizing JavaScript and PHP language to make web pages functional.
Excellent problem-solving skills and the ability to work independently or as part of a team.
Effective communication skills to convey complex technical concepts.
Benefits
Internship Certificate
Letter of recommendation
Stipend (performance-based)
Part time work from home (2-3 Hrs per day)
5 days a week, Fully Flexible Shift
Data engineers:
Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. This also includes developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity.
Constructing infrastructure for efficient ETL processes from various sources and storage systems.
Collaborating closely with Product Managers and Business Managers to design technical solutions aligned with business requirements.
Leading the implementation of algorithms and prototypes to transform raw data into useful information.
Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
Creating innovative data validation methods and data analysis tools.
Ensuring compliance with data governance and security policies.
Interpreting data trends and patterns to establish operational alerts.
Developing analytical tools, utilities, and reporting mechanisms.
Conducting complex data analysis and presenting results effectively.
Preparing data for prescriptive and predictive modeling.
Continuously exploring opportunities to enhance data quality and reliability.
Applying strong programming and problem-solving skills to develop scalable solutions.
Writing unit/integration tests and contributing to documentation.
Must have:
6 to 8 years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
High proficiency in Scala, Java, or Python, API frameworks such as Swagger, and Spark for applied large-scale data processing.
Expertise with big data technologies and API development (e.g., Flask), including Spark, Data Lake, Delta Lake, and Hive.
Solid understanding of batch and streaming data processing techniques.
Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion.
Expert-level ability to write complex, optimized SQL queries across extensive data volumes.
Experience with RDBMS and OLAP databases like MySQL, Redshift.
Familiarity with Agile methodologies.
Obsession for service observability, instrumentation, monitoring, and alerting.
Knowledge or experience in architectural best practices for building data pipelines.
Good to Have:
Passion for testing strategy, problem-solving, and continuous learning.
Willingness to acquire new skills and knowledge.
Possess a product/engineering mindset to drive impactful data solutions.
Experience working in distributed environments with teams scattered geographically.
Title:- Data Scientist
Experience:-6 years
Work Mode:- Onsite
Primary Skills:- Data Science, SQL, Python, Data Modelling, Azure, AWS, Banking Domain (BFSI/NBFC)
Qualification:- Any
Roles & Responsibilities:-
1. Acquiring, cleaning, and preprocessing raw data for analysis.
2. Utilizing statistical methods and tools for analyzing and interpreting complex datasets.
3. Developing and implementing machine learning models for predictive analysis.
4. Creating visualizations to effectively communicate insights to both technical and non-technical stakeholders.
5. Collaborating with cross-functional teams, including data engineers, business analysts, and domain experts.
6. Evaluating and optimizing the performance of machine learning models for accuracy and efficiency.
7. Identifying patterns and trends within data to inform business decision-making.
8. Staying updated on the latest advancements in data science, machine learning, and relevant technologies.
Requirement:-
1. Experience with modeling techniques such as Linear Regression, clustering, and classification techniques.
2. Must have a passion for data, structured or unstructured. 0.6 – 5 years of hands-on experience with Python and SQL is a must.
3. Should have sound experience in data mining, data analysis and machine learning techniques.
4. Excellent critical thinking, verbal and written communications skills.
5. Ability and desire to work in a proactive, highly engaging, high-pressure, client service environment.
6. Good presentation skills.
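The Linear Regression technique named in the requirements above can be illustrated from first principles. The sketch below fits ordinary least squares for a single feature using the covariance-over-variance formula; the data points are invented for illustration.

```python
# Ordinary least squares for a single feature, from first principles.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.1, 6.0, 8.2, 9.9]  # roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# slope = cov(x, y) / var(x); intercept from the means
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

print(round(slope, 2), round(intercept, 2))
```

The fitted slope lands close to the true generating value of 2, which is the sanity check one would run before trusting the fit on real data.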
We are working on AI for medical images. We need someone who can run pre-trained models and also train new ones.
at Blue Hex Software Private Limited
In this position, you will play a pivotal role in collaborating with our CFO, CTO, and our dedicated technical team to craft and develop cutting-edge AI-based products.
Role and Responsibilities:
- Develop and maintain Python-based software applications.
- Design and work with databases using SQL.
- Use Django, Streamlit, and front-end frameworks like Node.js and Svelte for web development.
- Create interactive data visualizations with charting libraries.
- Collaborate on scalable architecture and experimental tech.
- Work with AI/ML frameworks and data analytics.
- Utilize Git, DevOps basics, and JIRA for project management.
Skills and Qualifications:
- Strong Python programming skills.
- Proficiency in OOP and SQL.
- Experience with Django, Streamlit, Node.js, and Svelte.
- Familiarity with charting libraries.
- Knowledge of AI/ML frameworks.
- Basic Git and DevOps understanding.
- Effective communication and teamwork.
Company details: We are a team of Enterprise Transformation Experts who deliver radically transforming products, solutions, and consultation services to businesses of any size. Our exceptional team of diverse and passionate individuals is united by a common mission to democratize the transformative power of AI.
Website: Blue Hex Software – AI | CRM | CXM & DATA ANALYTICS
Responsibilities
- Selecting features, building, and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Extending the company’s data with third-party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad-hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Key Skills
- Hands-on experience with analysis tools like R and advanced Python
- Must-have knowledge of statistical techniques and machine learning algorithms
- Artificial Intelligence
- Understanding of text analysis and Natural Language Processing (NLP)
- Knowledge of Google Cloud Platform
- Advanced Excel and PowerPoint skills
- Advanced communication (written and oral) and strong interpersonal skills
- Ability to work cross-culturally
- Good to have: Deep Learning
- VBA and visualization tools like Tableau, Power BI, Qlik Sense, and QlikView will be an added advantage
Are you passionate about pushing the boundaries of Artificial Intelligence and its applications in the software development lifecycle? Are you excited about building AI models that can revolutionize how developers ship, refactor, and onboard to legacy or existing applications faster? If so, Zevo.ai has the perfect opportunity for you!
As an AI Researcher/Engineer at Zevo.ai, you will play a crucial role in developing cutting-edge AI models using CodeBERT and CodeXGLUE to achieve our goal of providing an AI solution that supports developers throughout the sprint cycle. You will be at the forefront of research and development, harnessing the power of Natural Language Processing (NLP) and Machine Learning (ML) to revolutionize the way software development is approached.
Responsibilities:
- AI Model Development: Design, implement, and refine AI models utilizing CodeBERT and CodeXGLUE to comprehend codebases, facilitate code understanding, automate code refactoring, and enhance the developer onboarding process.
- Research and Innovation: Stay up-to-date with the latest advancements in NLP and ML research, identifying novel techniques and methodologies that can be applied to Zevo.ai's AI solution. Conduct experiments, perform data analysis, and propose innovative approaches to enhance model performance.
- Data Collection and Preparation: Collaborate with data engineers to identify, collect, and preprocess relevant datasets necessary for training and evaluating AI models. Ensure data quality, correctness, and proper documentation.
- Model Evaluation and Optimization: Develop robust evaluation metrics to measure the performance of AI models accurately. Continuously optimize and fine-tune models to achieve state-of-the-art results.
- Code Integration and Deployment: Work closely with software developers to integrate AI models seamlessly into Zevo.ai's platform. Ensure smooth deployment and monitor the performance of the deployed models.
- Collaboration and Teamwork: Collaborate effectively with cross-functional teams, including data scientists, software engineers, and product managers, to align AI research efforts with overall company objectives.
- Documentation: Maintain detailed and clear documentation of research findings, methodologies, and model implementations to facilitate knowledge sharing and future developments.
- Ethics and Compliance: Ensure compliance with ethical guidelines and legal requirements related to AI model development, data privacy, and security.
Requirements
- Educational Background: Bachelor's/Master's or Ph.D. in Computer Science, Artificial Intelligence, Machine Learning, or a related field. A strong academic record with a focus on NLP and ML is highly desirable.
- Technical Expertise: Proficiency in NLP and Deep Learning, and experience with AI model development using frameworks like PyTorch or TensorFlow. Familiarity with CodeBERT and CodeXGLUE is a significant advantage.
- Programming Skills: Strong programming skills in Python and experience working with large-scale software projects.
- Research Experience: Proven track record of conducting research in NLP, ML, or related fields, demonstrated through publications, conference papers, or open-source contributions.
- Problem-Solving Abilities: Ability to identify and tackle complex problems related to AI model development and software engineering.
- Team Player: Excellent communication and interpersonal skills, with the ability to collaborate effectively in a team-oriented environment.
- Passion for AI: Demonstrated enthusiasm for AI and its potential to transform software development practices.
If you are eager to be at the forefront of AI research, driving innovation and impacting the software development industry, join Zevo.ai's talented team of experts as an AI Researcher/Engineer. Together, we'll shape the future of the sprint cycle and revolutionize how developers approach code understanding, refactoring, and onboarding!
An 8-year-old IT Services and consulting company.
CTC Budget: 35-55LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT Services and consulting company based in Hyderabad, providing services that maximize product value while delivering rapid incremental innovation. The company has extensive SaaS M&A experience, including 20+ closed transactions on both the buy and sell sides, has over 100 employees, and is looking to grow the team.
- 6 plus years of experience as a Python developer.
- Experience in web development using Python and Django Framework.
- Experience in Data Analysis and Data Science using Pandas, NumPy, and scikit-learn (GTH)
- Experience in developing User Interface using HTML, JavaScript, CSS.
- Experience in server-side templating languages including Jinja 2 and Mako
- Knowledge of Kafka and RabbitMQ (GTH)
- Experience with Docker, Git, and AWS
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- DB (MySQL, PostgreSQL, SQL)
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
RESPONSIBILITIES:
• You will be involved in directly driving application of machine learning and AI to solve various product and business problems including ML model lifecycle management with ideation, experimentation, implementation, and maintenance.
• Your responsibilities will include enabling the team in moving forward with ML/AI solutions to optimise various components across our music streaming platforms.
• Your work would impact the way millions of users consume music and podcasts; it would involve solving cold-start problems, understanding user personas, optimising ranking, and improving recommendations to serve relevant content to users.
• We are looking for a seasoned engineer to orchestrate our recommendations and discovery projects and also be involved in tech management within the team.
• The team of talented, passionate people in which you’ll work will include ML engineers and data scientists.
• You’ll be reporting directly to the head of engineering and will be instrumental in discussing and explaining progress to top management and other stakeholders.
• You’ll be expected to have regular conversations with product leads and other engineering leads to understand the requirements, and similar, more frequent conversations with your own team.
REQUIREMENTS:
• A machine learning software engineer with a passion for working on exciting, user impacting product and business problems
• Stay updated with the latest research in machine learning, especially recommender systems and audio signals
• Have taken scalable ML services to production, maintained and managed their lifecycle
• Good understanding of foundational mathematics associated with machine learning such as statistics, linear algebra, optimization, probabilistic models
Minimum Qualifications
• 13+ years of industry experience doing applied machine learning
• 5+ years of experience in tech team management
• Fluent in one or more object oriented languages like Python, C++, Java
• Knowledgeable about core CS concepts such as common data structures and algorithms
• Comfortable conducting design and code reviews
• Comfortable formalising a product or business problem as an ML problem
• Master’s or PhD degree in Computer Science, Mathematics or related field
• Industry experience with large scale recommendation and ranking systems
• Experience in managing team of 10-15 engineers
• Hands on experience with Spark, Hive, Flask, Tensorflow, XGBoost, Airflow
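A minimal flavour of the recommendation work described above: the sketch below ranks items by cosine similarity of their user play-count vectors, i.e. item-based collaborative filtering. The users, songs, and counts are made up for illustration; a production system would work over sparse matrices at far larger scale.

```python
import math

# Toy user x item play counts -- illustrative only.
plays = {
    "alice": {"songA": 5, "songB": 3, "songC": 0},
    "bob":   {"songA": 4, "songB": 2, "songC": 0},
    "cara":  {"songA": 0, "songB": 1, "songC": 6},
}

items = ["songA", "songB", "songC"]

def item_vector(item):
    # one component per user, in a fixed order
    return [plays[u][item] for u in sorted(plays)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Items most similar to songA, by co-listening pattern.
sims = sorted(
    ((cosine(item_vector("songA"), item_vector(i)), i)
     for i in items if i != "songA"),
    reverse=True,
)
best = sims[0][1]
print(best)  # songB: listened to by the same users as songA
```

Cold-start items with no plays fall back to similarity 0 here, which is exactly the gap that content features or exploration strategies are brought in to fill.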
We are a stealth startup focusing on AI in healthcare and are looking for software engineers to join our PoC (Proof of Concept) team. You will be a core member of the company with equity options. If you are ambitious, excited about next-generation tech and have a constant hunger to learn, we encourage you to apply.
Your responsibilities:
- Design and develop full-stack web applications for PoC.
- Implement computer vision and NLP based deep learning models.
- Participate in client meetings and refine product capabilities.
Essential requirements:
- Interested in working at startups.
- Ability to work independently.
- Experienced in developing full-stack web applications.
- Strong command over React and Python.
- Good experience with Data Science.
- Comfortable with ambiguity and frequent changes to project scope in innovation environment.
About the company: https://www.hectar.in/
Hectar Global is a financial technology company that provides a disruptive cross-border trading platform for the agricultural commodities market. Our platform utilizes machine learning and data analysis to provide insights and improve the trading experience for farmers, traders, and other stakeholders in the agricultural industry. Our mission is to bring transparency and efficiency to the agricultural commodities market, which has traditionally been fragmented and opaque. We are committed to driving innovation in the industry and providing a user-friendly and accessible platform for our customers.
Job Overview:
We are seeking a highly skilled Head of Engineering to lead the development of our innovative and disruptive cross-border trading platform at Hectar Global. This person will be responsible for spearheading all technical aspects of the project, from architecture and infrastructure to data and machine learning.
Responsibilities:
- Lead and manage a team of engineers, including hiring, training, and mentoring
- Design and implement the technical architecture of the cross-border trading platform
- Ensure the platform is scalable, efficient, and secure
- Oversee the development of data models and machine learning algorithms
- Collaborate with cross-functional teams, including product and design, to ensure the platform meets user needs and is visually appealing and easy to use
- Work with stakeholders to identify business requirements and translate them into technical solutions
- Develop and maintain technical documentation, including system specifications, design documents, and user manuals
- Keep up to date with emerging trends and technologies in software engineering, machine learning, and data science
Requirements:
- Bachelor's or Master's degree in Computer Science or related field
- 10+ years of experience in software engineering, with a focus on web applications and machine learning
- Proven track record of leading and managing a team of engineers
- Expertise in software architecture and design patterns
- Experience with data modeling and machine learning techniques
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams
- Experience working in an agile development environment
- Strong knowledge of front-end technologies, such as HTML, CSS, and JavaScript
- Familiarity with modern web frameworks, such as React or Angular
Preferred qualifications:
- Experience with cloud computing platforms, such as AWS or Azure
- Familiarity with data visualization tools, such as D3.js or Tableau
- Experience with containerization and orchestration tools, such as Docker and Kubernetes
- Understanding of financial markets and trading platforms
If you are a passionate leader with a proven track record of building innovative and disruptive products and teams, we would love to hear from you.
- Work on Python, with knowledge of the Scrapy framework and Beautiful Soup.
- Maintain the running web-crawler full-stack application.
- Create more/better ways to crawl relevant information.
- Build and maintain new API integrations to support continuing increases in data volume and complexity.
- Python tech stack (Python libraries: Requests, urllib, BeautifulSoup).
- Good communication skills.
world’s first real-time opportunity engine. We constantly cr
● Statistics - Always makes data-driven decisions using tools from statistics, such as: populations and
sampling, normal distribution and central limit theorem, mean, median, mode, variance, standard
deviation, covariance, correlation, p-value, expected value, conditional probability and Bayes's theorem
● Machine Learning
○ Solid grasp of attention mechanism, transformers, convolutions, optimisers, loss functions,
LSTMs, forget gates, activation functions.
○ Can implement all of these from scratch in pytorch, tensorflow or numpy.
○ Comfortable defining own model architectures, custom layers and loss functions.
● Modelling
○ Comfortable using all the major ML frameworks (PyTorch, TensorFlow, scikit-learn, etc.) and, ideally, NLP models. Able to pick the right library and framework for the job.
○ Capable of turning research and papers into operational execution and functionality delivery.
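Two of the building blocks named above (an activation function and a loss function) can be implemented from scratch in NumPy along these lines; the logits and labels are illustrative only:

```python
# From-scratch NumPy versions of two items from the list above: an
# activation (softmax) and a loss (cross-entropy). Values are illustrative.
import numpy as np

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class for each row.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels]).mean()

logits = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
probs = softmax(logits)
loss = cross_entropy(probs, np.array([0, 2]))
```

The same functions written against PyTorch or TensorFlow tensors would slot into a custom model head or loss.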
Roles and Responsibilities
- Managing available resources such as hardware, data, and personnel so that deadlines are met.
- Analyzing the ML and Deep Learning algorithms that could be used to solve a given problem and ranking them by their success probabilities
- Exploring data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world
- Defining the validation framework and establishing a process to ensure acceptable data quality criteria are met
- Supervising the data acquisition and partnership roadmaps to create a stronger product for our customers
- Defining the feature engineering process to ensure usage of meaningful features given the business constraints, which may vary by market
- Devising self-learning strategies through analysis of errors from the models
- Understand business issues and context, devise a framework for solving unstructured problems and articulate clear and actionable solutions underpinned by analytics.
- Manage multiple projects simultaneously while demonstrating business leadership to collaborate & coordinate with different functions to deliver the solutions in a timely, efficient and effective manner.
- Manage project resources optimally to deliver projects on time; drive innovation using residual resources to create a strong solution pipeline; provide direction, coaching, training and feedback to project team members to enhance performance, support development and encourage value-aligned behaviour; provide inputs for periodic performance appraisals of project team members.
Preferred Technical & Professional expertise
- Undergraduate Degree in Computer Science / Engineering / Mathematics / Statistics / economics or other quantitative fields
- At least 2 years of experience managing Data Science projects with specialization in Machine Learning
- In-depth knowledge of cloud analytics tools.
- Able to drive Python code optimization; ability to review code and provide inputs to improve its quality
- Ability to evaluate hardware selection for running ML models for optimal performance
- Up to date with Python libraries and versions for machine learning; Extensive hands-on experience with Regressors; Experience working with data pipelines.
- Deep knowledge of math, probability, statistics and algorithms; Working knowledge of Supervised Learning, Adversarial Learning and Unsupervised learning
- Deep analytical thinking with excellent problem-solving abilities
- Strong verbal and written communication skills with a proven ability to work with all levels of management; effective interpersonal and influencing skills.
- Ability to manage a project team through effective allocation of tasks, anticipating risks and setting realistic timelines to manage the expectations of key stakeholders
- Strong organizational skills and an ability to balance and handle multiple concurrent tasks and/or issues simultaneously.
- Ensure that the project team understands and abides by the compliance framework for policies, data, systems etc. as per group, region and local standards
2-5 yrs of proven experience in ML, DL, and preferably NLP.
Preferred Educational Background - B.E/B.Tech, M.S./M.Tech, Ph.D.
What will you work on?
1) Problem formulation and solution designing of ML/NLP applications across complex well-defined as well as open-ended healthcare problems.
2) Cutting-edge machine learning, data mining, and statistical techniques to analyse and utilise large-scale structured and unstructured clinical data.
3) End-to-end development of company proprietary AI engines - data collection, cleaning, data modelling, model training / testing, monitoring, and deployment.
4) Research and innovate novel ML algorithms and their applications suited to the problem at hand.
What are we looking for?
1) Deep understanding of business objectives and ability to formulate the problem as a Data Science problem.
2) Solid expertise in knowledge graphs, graph neural nets, clustering, classification.
3) Strong understanding of data normalization techniques, SVM, Random Forest, data visualization techniques.
4) Expertise in RNN, LSTM, and other neural network architectures.
5) DL frameworks: TensorFlow, PyTorch, Keras
6) High proficiency with standard database skills (e.g., SQL, MongoDB, Graph DB), data preparation, cleaning, and wrangling/munging.
7) Comfortable with web scraping, extracting, manipulating, and analyzing complex, high-volume, high-dimensionality data from varying sources.
8) Experience with deploying ML models on cloud platforms like AWS or Azure.
9) Familiarity with version control with Git, BitBucket, SVN, or similar.
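The standard database skills mentioned above can be illustrated with Python's built-in sqlite3 module; the table, columns and rows here are made up for the example:

```python
# Quick illustration of standard SQL skills using Python's built-in
# sqlite3 module. The visits table and its rows are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (patient TEXT, cost REAL)")
conn.executemany("INSERT INTO visits VALUES (?, ?)",
                 [("A", 100.0), ("A", 50.0), ("B", 80.0)])

# Aggregate cost per patient, highest total first.
rows = conn.execute(
    "SELECT patient, SUM(cost) AS total FROM visits "
    "GROUP BY patient ORDER BY total DESC"
).fetchall()
print(rows)  # [('A', 150.0), ('B', 80.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to production databases.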
Why choose us?
1) We offer competitive remuneration.
2) We give opportunities to work on exciting and cutting-edge machine learning problems, so you contribute towards transforming the healthcare industry.
3) We offer flexibility to choose your tools, methods, and ways to collaborate.
4) We always value and believe in new ideas and encourage creative thinking.
5) We offer an open culture where you will work closely with the founding team and have the chance to influence product design and execution.
6) And, of course, the thrill of being part of an early-stage startup, launching a product, and seeing it in the hands of the users.
- Manages the delivery of large, complex Data Science projects using appropriate frameworks, collaborating with stakeholders to manage scope and risk. Helps the AI/ML Solution Analyst build solutions as per customer needs on our platform, Newgen AI Cloud.
- Drives profitability and continued success by managing service quality and cost and leading delivery. Proactively supports sales through innovative solutions and delivery excellence.
Work location: Gurugram
Key Responsibilities:
1. Collaborate/contribute to all project phases; technical know-how to design, develop solutions and deploy at the customer end.
2. End-to-end implementations, i.e. gathering requirements, analysing, designing, coding, and deployment to production.
3. Client-facing role: talking to the client on a regular basis to get requirement clarifications.
4. Lead the team.
Core Tech Skills: Azure, Cloud Computing, Java/Scala, Python, Design Patterns and fair knowledge of Data Science. Fair knowledge of Data Lake/DWH.
Educational Qualification: Engineering graduate, preferably a Computer Science graduate.
Top Management Consulting Company
We are looking for a Machine Learning engineer for one of our premium clients.
Experience: 2-9 years
Location: Gurgaon/Bangalore
Tech Stack:
Python, PySpark, the Python Scientific Stack; MLFlow, Grafana, Prometheus for machine learning pipeline management and monitoring; SQL, Airflow, Databricks, our own open-source data pipelining framework called Kedro, Dask/RAPIDS; Django, GraphQL and ReactJS for horizontal product development; container technologies such as Docker and Kubernetes, CircleCI/Jenkins for CI/CD, cloud solutions such as AWS, GCP, and Azure as well as Terraform and Cloudformation for deployment
1. ROLE AND RESPONSIBILITIES
1.1. Implement next generation intelligent data platform solutions that help build high performance distributed systems.
1.2. Proactively diagnose problems and envisage long term life of the product focusing on reusable, extensible components.
1.3. Ensure agile delivery processes.
1.4. Work collaboratively with stakeholders including product and engineering teams.
1.5. Build best-practices in the engineering team.
2. PRIMARY SKILL REQUIRED
2.1. 2-6 years of core software product development experience.
2.2. Experience working on data-intensive projects with a variety of technology stacks, including different programming languages (Java, Python, Scala).
2.3. Experience in building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, to support other teams running pipelines/jobs/reports etc.
2.4. Experience with the open-source stack.
2.5. Experience working with RDBMS and NoSQL databases.
2.6. Knowledge of enterprise data lakes, data analytics, reporting, in-memory data handling, etc.
2.7. A core computer science academic background.
2.8. An aspiration to continue pursuing a career in the technical stream.
3. Optional Skill Required:
3.1. Understanding of Big Data technologies and Machine learning/Deep learning
3.2. Understanding of diverse set of databases like MongoDB, Cassandra, Redshift, Postgres, etc.
3.3. Understanding of Cloud Platform: AWS, Azure, GCP, etc.
3.4. Experience in BFSI domain is a plus.
4. PREFERRED SKILLS
4.1. A startup mentality: comfort with ambiguity, a willingness to test, learn and improve rapidly.
- Collaborate with the business teams to understand the data environment in the organization; develop and lead the Data Scientists team to test and scale new algorithms through pilots and subsequent scaling up of the solutions
- Influence, build and maintain the large-scale data infrastructure required for the AI projects, and integrate with external IT infrastructure/service
- Act as the single point of contact for all data-related queries; strong understanding of internal and external data sources; provide inputs in deciding data-schemas
- Design, develop and maintain the framework for the analytics solutions pipeline
- Provide inputs to the organization’s initiatives on data quality and help implement frameworks and tools for the various related initiatives
- Work in cross-functional teams of software/machine learning engineers, data scientists, product managers, and others to build the AI ecosystem
- Collaborate with the external organizations including vendors, where required, in respect of all data-related queries as well as implementation initiatives
- Work closely with your business to identify issues and use data to propose solutions for effective decision making
- Build algorithms and design experiments to merge, manage, interrogate and extract data to supply tailored reports to colleagues, customers or the wider organisation.
- Creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc
- Querying databases and using statistical computer languages: R, Python, SQL, etc.
- Visualizing/presenting data through various dashboards for data analysis, using Python Dash, Flask, etc.
- Test data mining models to select the most appropriate ones for use on a project
- Work in a POSIX/UNIX environment to run/deploy applications
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes.
- Assess the effectiveness of data sources and data-gathering techniques and improve data collection methods
- Horizon scan to stay up to date with the latest technology, techniques and methods
- Coordinate with different functional teams to implement models and monitor outcomes.
- Stay curious and enthusiastic about using algorithms to solve problems and enthuse others to see the benefit of your work.
General Expectations:
- Able to create algorithms to extract information from large data sets
- Strong knowledge of Python, R, Java or other scripting/statistical languages to automate data retrieval, manipulation and analysis.
- Experience with extracting and aggregating data from large data sets using SQL or other tools
- Strong understanding of various NLP, and NLU techniques like Named Entity Recognition, Summarization, Topic Modeling, Text Classification, Lemmatization and Stemming.
- Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest, Boosting, Trees, etc.
- Experience with Python libraries such as Pandas, NumPy, SciPy, Scikit-Learn
- Experience with Jupyter / Pandas / Numpy to manipulate and analyse data
- Knowledge of Machine Learning techniques and their respective pros and cons
- Strong Knowledge of various Data Science Visualization Tools like Tableau, PowerBI, D3, Plotly, etc.
- Experience using web services: Redshift, AWS, S3, Spark, DigitalOcean, etc.
- Proficiency in using query languages, such as SQL, Spark DataFrame API, etc.
- Hands-on experience in HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery and Prototyping.
- Hands-on experience in C#, JavaScript, .NET
- Experience in understanding and analyzing data using statistical software (e.g., Python, R, KDB+ and other relevant libraries)
- Experienced in building applications that meet enterprise needs – secure, scalable, loosely coupled design
- Strong knowledge of computer science, algorithms, and design patterns
- Strong oral and written communication, and other soft skills critical to collaborating and engaging with teams
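As a minimal illustration of the GLM/Regression techniques listed above, the sketch below fits an ordinary least squares line with NumPy; the data is synthetic and noiseless:

```python
# Ordinary least squares fit with NumPy, as a toy version of the
# regression skills listed above. Data is synthetic: y = 2x + 1.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                           # noiseless synthetic target

X = np.column_stack([x, np.ones_like(x)])   # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = coef
```

On real data the same call recovers the least-squares estimates of slope and intercept; a GLM adds a link function and a non-Gaussian error model on top of this.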
Enterprise Minds is looking for a Data Scientist.
Strong in Python and PySpark.
Immediate joiners preferred.
Our Company
Tealbox.Digital is a Marketing-Technology company founded by IIT-Delhi alumni, focused on solving complex problems for businesses operating online. We leverage Paid Media to help clients all over the globe acquire new customers, retain existing ones and maximize customer lifetime value. We have been involved with several pre-sales startups that approached us with the intent to establish their proof of concept and we have enabled those businesses to grow from there to seven-figure turnovers. Our performances have impacted businesses so largely that we were able to set them up to raise capital from large investors. We currently operate in four continents and are growing steadily owing to the current need for businesses to go online and efficient performance marketing solutions to keep this channel sustainable. The businesses we largely work with are small-scale enterprises and startups and we are directly responsible for shaping how their business grows.
TealBox Digital is an Indian startup with a global outlook. We are an equal opportunity employer and are committed to creating exceptional employee experiences. We believe in empowering people by focusing on employee development and investing in continuous learning opportunities. We’re committed to helping people thrive professionally and personally.
Job Description
We are looking for curious individuals who are natural leaders and have the ability to recognise what needs to be done. We require candidates with strong written & verbal communication skills and analytical thinkers. The ideal candidate will need to be able to effectively articulate insights and recommended actions.
In this role,
- You need to analyze and solve increasingly complex business problems.
- You will be working as a part of the Digital Marketing & Customer Analytics team which provides a set of processes that measure, manage and analyze marketing activities in order to provide actionable insights and recommendations to clients’ advertising campaigns to optimize ROI & performance efficiency in operations.
- You will have experience in transforming large amounts of diverse business data into valuable and meaningful information that can be used to support campaign-wise decision-making.
- Design and carry out high quality analysis in service of projects undertaken on behalf of clients and ensure the analysis is presented in a compelling way.
- You will have good written and oral communication skills in English language and the ability to transform data into actionable insights for an audience composed of digital experts, novices and beginners.
- Conceptualize and implement Paid Media strategies to acquire new customers and grow client businesses.
- Stay informed of the relevant industry, paid media, and paid media platform trends and best practices.
- The ideal candidate is someone who enjoys solving problems. They love taking on difficult challenges and finding creative solutions and does not get flustered easily.
Requirements
- Graduates with an aptitude to pick up new skills or 4-6 years’ experience in the marketing segment with a clear understanding of digital marketing or individuals with equivalent/relevant experience.
- Experience with FB Ads, Google Ads, Amazon Ads, Programmatic Buying and other allied tools such as Google Analytics, Google Tag Manager etc, is a plus but NOT mandatory.
- Strong presentation skills and ability to work with diverse stakeholders.
- Ability to analyze large data sets and use analytical skills to create easy-to-understand, actionable insights for business stakeholders.
- Intermediate knowledge/experience with Google Sheets, Google Slides, Google Data Studio, Python, etc.
- Strong math/analytical and quantitative skills.
- Strong problem-solving skills.
- Very strong oral and written communication skills.
- Intermediate knowledge/experience with relational databases and report development.
- Close attention to detail, accuracy, follow-through and excellent organizational skills.
- Proof of ability to take up responsibility and deliver on actions
A Bachelor’s degree in data science, statistics, computer science, or a similar field
2+ years of industry experience working in a data science role, such as statistics, machine learning, deep learning, quantitative financial analysis, data engineering or natural language processing
Domain experience in Financial Services (banking, insurance, risk, funds) is preferred
Experience in producing and rapidly delivering minimum viable products; results-focused, with the ability to prioritize the most impactful deliverables
Strong applied statistics capabilities, including an excellent understanding of Machine Learning techniques and algorithms
Hands-on experience, preferably in implementing scalable Machine Learning solutions using Python / Scala / Java on Azure, AWS or Google Cloud Platform
Experience with storage frameworks like Hadoop, Spark, Kafka, etc.
Experience in building & deploying unsupervised, semi-supervised, and supervised models, and knowledge of various ML algorithms such as regression models, tree-based algorithms, ensemble learning techniques, distance-based ML algorithms, etc.
Ability to track down complex data quality and data integration issues, evaluate different algorithmic approaches, and analyse data to solve problems
Experience in implementing parallel processing and in-memory frameworks such as H2O.ai
Cogoport Story
Do you prefer to get speeding tickets or parking tickets?
Because at Cogoport we are speeding ahead to do something remarkable for the world. We are trying to solve the Trade Knowledge and Execution Gap, which is currently widening and preventing trade to the tune of $3.4 trillion annually. This Gap has enormous economic as well as human impact and disproportionately hits small and medium businesses globally.
The team at Cogoport is working on developing a new category, the Global Trade Platform, that helps companies discover and connect with appropriate trade partners, optimize shipping and logistics options and costs, and improve cash management and cash flow.
Cogoport is currently in hypergrowth mode. We are proud to have been named an Asia-Pacific High-Growth Company by the Financial Times and an Indian Growth Champion by the Economic Times. We are aiming to reach an annualized revenue of $1 billion (7700 Crores INR) by this summer and are hiring over 500 additional employees. We are currently hiring in Mumbai, Gurgaon, Chennai and Bangalore.
Cogoport Culture: We have two core values at Cogoport—Intrapreneurship and Customer-centricity. If you share these values and are a hard worker who is willing to take risks (and occasionally get a speeding ticket), you can make a huge impact and propel your career in an endless number of directions with Cogoport.
Cogoport Leadership
https://www.linkedin.com/in/purnendushekhar/
https://www.linkedin.com/in/amitabhshankar/
Life at Cogoport: It’s rare to be able to join a company that can give you the resources, support and technology you need to break new ground and see your ideas come to life. You’ll be surrounded by some of the smartest engineers and commercial specialists in India and the Asia Pacific Region.
With huge growth and the right entrepreneurial mindset, comes huge opportunities! So, wherever you join us, you’ll be able to dream, deliver better and brighter solutions, and speed ahead with the possibility to propel your career forward in endless directions as our company continues to grow and expand.
For more insights about the company: https://cogoport.com/about
Why Cogoport?
International Trade can be complicated at times and every day brings new challenges and opportunities to learn. When we simplify international trade, it empowers and affects every human being on the face of this earth. Seven billion people - one common problem.
As a part of the Talent team at Cogoport, you will get an opportunity to be a part of an industry-wide revolution in the world of shipping and logistics by collaborating with other brilliant minds to resolve real world on-ground challenges. You will have a direct impact on the revenue and profitability growth for the organization.
Areas of Impact for you
- Hands-on management with deep-dive into the details of software design, implementation and debugging.
- Guide your teams in developing roadmaps and systems to drive product growth, then identify, plan, and execute projects to support that growth.
- Manage multiple projects across a wide breadth of technologies, coordinate dependencies, and interactions with the internal teams and external partners.
- Collaborate with stakeholders from across functions to keep the development team in sync with all functions and overall business objectives.
- Develop large multi-tenant applications in Rails.
- Understand Rails best practices and religiously introduce those to the codebase.
- Set up, create and manage strong best practices/architecture to ensure reliable, secure, bug-free, and performant software is released on-time.
Desirable Skills and Experience
- Loves coding.
- 4-6 years of experience managing technology teams.
- Demonstrated ability to build complex scalable technology products.
- Should have prior experience of working with ROR, React, PostgreSQL and cloud infra.
- Understanding scaling strategies for a high-traffic Rails applications.
- Understanding OAuth2 or JWT (JSON Web Token) authentication mechanisms.
- Experience in using ActiveRecordSerialize, RSpec and active interaction.
- Knowledge about Asynchronous Networking in Ruby; Refactoring ActiveRecord Models; Background Job processing using Redis and Sidekiq; Writing automated Deployment Scripts using Capistrano, Ansible etc.
- Expertise in Data Science and Machine Learning is a plus.
- Expertise in Jenkins, Kubernetes, Docker and cloud technology is a plus.
Cogoport is an equal opportunity employer. We are a welcoming place for everyone, and we do our best to make sure all people feel supported and respected at work.
Areas of Impact for you
- Hands-on management with deep-dive into the details of software design, implementation and debugging.
- Guide your teams in developing roadmaps and systems to drive product growth, then identify, plan, and execute projects to support that growth.
- Manage multiple projects across a wide breadth of technologies, coordinate dependencies, and interactions with the internal teams and external partners.
- Collaborate with stakeholders from across functions to keep the development team in sync with all functions and overall business objectives.
- Develop large multi-tenant applications in Rails.
- Understand Rails best practices and religiously introduce those to the codebase.
- Set up, create and manage strong best practices/architecture to ensure reliable, secure, bug-free, and performant software is released on-time.
Desirable Skills and Experience
- Loves coding.
- 2-4 years of experience building scalable & complex products from scratch.
- Demonstrated ability to build complex scalable technology products.
- Should have prior experience of working with ROR, React, PostgreSQL and cloud infra.
- Understanding scaling strategies for high-traffic Rails applications.
- Understanding OAuth2 or JWT (JSON Web Token) authentication mechanisms.
- Experience in using ActiveRecordSerialize, RSpec and active interaction.
- Knowledge about Asynchronous Networking in Ruby; Refactoring ActiveRecord Models; Background Job processing using Redis and Sidekiq; Writing automated Deployment Scripts using Capistrano, Ansible etc.
- Expertise in Data Science and Machine Learning is a plus.
- Expertise in Jenkins, Kubernetes, Docker and cloud technology is a plus.
Cogoport is an equal opportunity employer. We are a welcoming place for everyone, and we do our best to make sure all people feel supported and respected at work.
Job brief
We are looking for a Data Scientist to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends to make better decisions.
Requirements
1. 2 to 4 years of relevant industry experience
2. Experience in linear algebra, statistics & probability (e.g., distributions), Deep Learning, and Machine Learning
3. Strong mathematical and statistics background is a must
4. Experience in machine learning frameworks such as Tensorflow, Caffe, PyTorch, or MxNet
5. Strong industry experience in using design patterns, algorithms and data structures
6. Industry experience in using feature engineering, model performance tuning, and optimizing machine learning models
7. Hands-on development experience in Python and packages such as NumPy, scikit-learn and Matplotlib
8. Experience in model building and hyperparameter tuning
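The model building and performance tuning mentioned above can be sketched as a simple validation-based search over one hyperparameter (polynomial degree); the data, split and candidate degrees are all invented for illustration:

```python
# Toy version of model building + hyperparameter tuning: choose a
# polynomial degree by validation error. Data and split are made up.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = x**2 + 0.01 * rng.standard_normal(40)   # quadratic signal, small noise

x_tr, y_tr = x[::2], y[::2]                 # even indices: train
x_va, y_va = x[1::2], y[1::2]               # odd indices: validation

def val_error(degree):
    # Fit on the training split, score on the held-out validation split.
    coeffs = np.polyfit(x_tr, y_tr, degree)
    pred = np.polyval(coeffs, x_va)
    return np.mean((pred - y_va) ** 2)

best_degree = min([1, 2, 3, 4], key=val_error)
```

The same fit-on-train, score-on-validation loop generalizes to grid or random search over any model's hyperparameters.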
Role :
- Understand and translate statistics and analytics to address business problems
- Responsible for helping in data preparation and data pull, which is the first step in machine learning
- Should be able to cut and slice data to extract interesting insights
- Model development for better customer engagement and retention
- Hands on experience in relevant tools like SQL(expert), Excel, R/Python
- Working on strategy development to increase business revenue
Requirements:
- Hands on experience in relevant tools like SQL(expert), Excel, R/Python
- Statistics: Strong knowledge of statistics
- Should be able to do data scraping & data mining
- Be self-driven, and show ability to deliver on ambiguous projects
- An ability and interest in working in a fast-paced, ambiguous and rapidly-changing environment
- Should have worked on business projects for an organization, e.g., customer acquisition, customer retention.
at Persistent Systems
We have an urgent requirement for the post of IBM MDM (AE) profile.
Notice period: should be 15-30 days.
Lead QA: more than 5 years of experience; has led a team of more than 5 people on a big data platform; should have experience with a test automation framework and with test process documentation.
client of peoplefirst consultants
Skills: Machine Learning, Deep Learning, Artificial Intelligence, Python.
Location:Chennai
Domain knowledge: Data cleaning, modelling, analytics, statistics, machine learning, AI
Requirements:
· To be part of Digital Manufacturing and Industrie 4.0 projects across Saint Gobain group of companies
· Design and develop AI/ML models to be deployed across SG factories
· Knowledge on Hadoop, Apache Spark, MapReduce, Scala, Python programming, SQL and NoSQL databases is required
· Should be strong in statistics, data analysis, data modelling, machine learning techniques and Neural Networks
· Prior experience in developing AI and ML models is required
· Experience with data from the Manufacturing Industry would be a plus
Roles and Responsibilities:
· Develop AI and ML models for the Manufacturing Industry with a focus on Energy, Asset Performance Optimization and Logistics
· Multitasking and good communication skills are necessary
· Entrepreneurial attitude.
at Navana Tech
Must have experience on e-commerce projects
Understand business problems and translate business requirements into technical requirements.
Conduct complex data analysis to ensure data quality & reliability i.e., make the data talk by extracting, preparing, and transforming it.
Identify, develop and implement statistical techniques and algorithms to address business challenges and add value to the organization.
Gather requirements and communicate findings in the form of a meaningful story with the stakeholders.
Build and implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption.
Lead and mentor data analysts.
What we are looking for-
Apart from your love for data and ability to code even while sleeping, you would need the following.
Minimum of 2 years of experience in designing and delivering data science solutions.
You should have successful retail/BFSI/FMCG/Manufacturing/QSR projects in your kitty to show off.
Deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand.
Ability to choose the right model for the data and translate that into a code using R, Python, VBA, SQL, etc.
Bachelor's/Master's degree in Engineering/Technology, an MBA from a Tier-1 B-school, or an MSc in Statistics or Mathematics.
2. Build large datasets that will be used to train the models
3. Empirically evaluate related research works
4. Train and evaluate deep learning architectures on multiple large scale datasets
5. Collaborate with the rest of the research team to produce high-quality research
Global SaaS product built to help revenue teams. (TP1)
- You'd have to set up your own shop, work with design customers to find generalizable use cases, and build them out.
- Ability to collaborate with cross-functional teams to build and ship new features
- At least 2-5 years of experience
- Predictive Analytics – Machine Learning Algorithms, Logistic and Linear Regression, Decision Trees, Clustering.
- Exploratory Data Analysis – Data Preparation, Data Exploration, and Data Visualization.
- Analytics Tools – R, Python, SQL, Power BI, MS Excel.
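As a minimal sketch of the linear-regression technique listed above, here is ordinary least squares for a single feature in pure Python (the data is made up so the fit recovers the line exactly; a real project would use R, Python's scikit-learn, or Excel as listed):

```python
# Ordinary least squares for simple linear regression, y = a + b*x.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = cov(x, y) / var(x); intercept a = mean_y - b * mean_x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    a = mean_y - b * mean_x
    return a, b

# Toy data lying exactly on y = 2x + 1, so the fit should recover a=1, b=2.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)  # → 1.0 2.0
```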
Job Responsibilities:-
- Develop robust, scalable and maintainable machine learning models to answer business problems against large data sets.
- Build methods for document clustering, topic modeling, text classification, named entity recognition, sentiment analysis, and POS tagging.
- Perform elements of data cleaning, feature selection and feature engineering and organize experiments in conjunction with best practices.
- Benchmark, apply, and test algorithms against success metrics. Interpret the results in terms of relating those metrics to the business process.
- Work with development teams to ensure models can be implemented as part of a delivered solution replicable across many clients.
- Knowledge of Machine Learning, NLP, Document Classification, Topic Modeling and Information Extraction with a proven track record of applying them to real problems.
- Experience working with big data systems and big data concepts.
- Ability to provide clear and concise communication both with other technical teams and non-technical domain specialists.
- Strong team player; ability to provide both a strong individual contribution but also work as a team and contribute to wider goals is a must in this dynamic environment.
- Experience with noisy and/or unstructured textual data.
- Knowledge graphs and NLP, including summarization, topic modelling, etc.
- Strong coding ability with statistical analysis tools in Python or R, and general software development skills (source code management, debugging, testing, deployment, etc.)
- Working knowledge of various text mining algorithms and their use-cases such as keyword extraction, PLSA, LDA, HMM, CRF, deep learning & recurrent ANN, word2vec/doc2vec, Bayesian modeling.
- Strong understanding of text pre-processing and normalization techniques, such as tokenization, POS tagging, and parsing, and how they work at a low level.
- Excellent problem-solving skills.
- Strong verbal and written communication skills.
- Master's or higher in data mining or machine learning, or equivalent practical analytics/modelling experience.
- Practical experience in using NLP related techniques and algorithms
- Experience with open-source coding and communities is desirable.
- Able to containerize models and associated modules and work in a microservices environment.
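The tokenization and normalization steps named above can be illustrated with a minimal regex-based sketch (a real pipeline would use a library such as spaCy or NLTK; the stopword list here is a tiny illustrative subset):

```python
import re

def tokenize(text):
    # Lowercase, then extract alphanumeric tokens only.
    return re.findall(r"[a-z0-9]+", text.lower())

# Tiny illustrative stopword list -- real lists have hundreds of entries.
STOPWORDS = {"the", "a", "of", "and", "is"}

def normalize(tokens):
    # Drop stopwords, a common normalization step before topic modeling.
    return [t for t in tokens if t not in STOPWORDS]

tokens = normalize(tokenize("The quick brown fox is part of the NLP pipeline."))
print(tokens)  # → ['quick', 'brown', 'fox', 'part', 'nlp', 'pipeline']
```

The resulting token stream is the typical input to the keyword-extraction, LDA, and word2vec techniques the posting mentions.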
A content consumption and discovery app which provides news
Data Scientist
Requirements
● B.Tech/Master's in Mathematics, Statistics, Computer Science or another quantitative field
● 2-3+ years of work experience in the ML domain (2-5 years of experience)
● Hands-on coding experience in Python
● Experience in machine learning techniques such as Regression, Classification, Predictive modeling, Clustering, Deep Learning stack, NLP
● Working knowledge of TensorFlow/PyTorch
Optional Add-ons-
● Experience with distributed computing frameworks: Map/Reduce, Hadoop, Spark etc.
● Experience with databases: MongoDB
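The classification technique in the list above can be sketched as logistic regression trained by plain gradient descent (pure Python with toy data; in practice this would be a few lines of scikit-learn or PyTorch, as the posting expects):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.1, epochs=1000):
    # Single-feature logistic regression: p = sigmoid(w*x + b).
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            # Gradient of the log-loss with respect to w and b.
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

# Toy data: the label is 1 when x > 2.5, so the learned boundary
# should separate x=1 (predicted 0) from x=4 (predicted 1).
w, b = train_logistic([1, 2, 3, 4], [0, 0, 1, 1])
print(sigmoid(w * 1 + b) < 0.5, sigmoid(w * 4 + b) > 0.5)  # → True True
```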
Job Description
Do you have a passion for computer vision and deep learning problems? We are looking for someone who thrives on collaboration and wants to push the boundaries of what is possible today! Material Depot (materialdepot.in) is on a mission to be India’s largest tech company in the Architecture, Engineering and Construction space by democratizing the construction ecosystem and bringing stakeholders onto a common digital platform. Our engineering team is responsible for developing Computer Vision and Machine Learning tools to enable digitization across the construction ecosystem. The founding team includes people from top management consulting firms and top colleges in India (like BCG and IITB), has worked extensively in the construction space globally, and is funded by top Indian VCs.
Our team empowers Architectural and Design Businesses to effectively manage their day to day operations. We are seeking an experienced, talented Data Scientist to join our team. You’ll be bringing your talents and expertise to continue building and evolving our highly available and distributed platform.
Our solutions need complex problem solving in computer vision that require robust, efficient, well tested, and clean solutions. The ideal candidate will possess the self-motivation, curiosity, and initiative to achieve those goals. Analogously, the candidate is a lifelong learner who passionately seeks to improve themselves and the quality of their work. You will work together with similar minds in a unique team where your skills and expertise can be used to influence future user experiences that will be used by millions.
In this role, you will:
- Extensive knowledge in machine learning and deep learning techniques
- Solid background in image processing/computer vision
- Experience in building datasets for computer vision tasks
- Experience working with and creating data structures / architectures
- Proficiency in at least one major machine learning framework
- Experience visualizing data to stakeholders
- Ability to analyze and debug complex algorithms
- Good understanding and applied experience in classic 2D image processing and segmentation
- Robust semantic object detection under different lighting conditions
- Segmentation of non-rigid contours in challenging/low contrast scenarios
- Sub-pixel accurate refinement of contours and features
- Experience in image quality assessment
- Experience with in depth failure analysis of algorithms
- Highly skilled in at least one scripting language such as Python or Matlab and solid experience in C++
- Creativity and curiosity for solving highly complex problems
- Excellent communication and collaboration skills
- Mentor and support other technical team members in the organization
- Create, improve, and refine workflows and processes for delivering quality software on time and with carefully calculated debt
- Work closely with product managers, customer support representatives, and account executives to help the business move fast and efficiently through relentless automation.
How you will do this:
- You’re part of an agile, multidisciplinary team.
- You bring your own unique skill set to the table and collaborate with others to accomplish your team’s goals.
- You prioritize your work with the team and its product owner, weighing both the business and technical value of each task.
- You experiment, test, try, fail, and learn continuously.
- You don’t do things just because they were always done that way, you bring your experience and expertise with you and help the team make the best decisions.
For this role, you must have:
- Strong knowledge of and experience with the functional programming paradigm.
- Experience conducting code reviews, providing feedback to other engineers.
- Great communication skills and a proven ability to work as part of a tight-knit team.
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a Data Lead - someone who works at the intersection of data science, GIS, and engineering. We want a leader who not only understands environmental data but someone who can quickly assemble large scale datasets that are crucial to the well being of our planet. Come save the planet with us!
Your Role
Manage: As a leadership position, this requires long term strategic thinking. You will be in charge of daily operations of the data team. This would include running team standups, planning the execution of data generation and ensuring the algorithms are put in production. You will also be the person in charge of dumbing down the data science for the rest of us who do not know what it means.
Love and Live Data: You will also be taking all the responsibility of ensuring that the data we generate is accurate, clean, and is ready to use for our clients. This would entail that you understand what the market needs, calculate feasibilities and build data pipelines. You should understand the algorithms that we use or need to use and take decisions on what would serve the needs of our clients well. We also want our Data Lead to be constantly probing for newer and optimized ways of generating datasets. It would help if they were abreast of all the latest developments in the data science and environmental worlds. The Data Lead also has to be able to work with our Platform team on integrating the data on our platform and API portal.
Collaboration: We use Clubhouse to track and manage our projects across our organization - this will require you to collaborate with the team and follow up with members on a regular basis. About 50% of the work needs to be keeping a pulse on the platform team. You'll collaborate closely with peers from other functions—Design, Product, Marketing, Sales, and Support to name a few—on our overall product roadmap, on product launches, and on ongoing operations. You will find yourself working with the product management team to define and execute the feature roadmap. You will be expected to work closely with the CTO, reporting on daily operations and development. We don't believe in a top-down hierarchical approach and are transparent with everyone. This means honest and mutual feedback and the ability to adapt.
Teaching: Not exactly in the traditional sense. You'll recruit, coach, and develop engineers while ensuring that they are regularly receiving feedback and making rapid progress on personal and professional goals.
Humble and cool: Look we will be upfront with you about one thing - our team is fairly young and is always buzzing with work. In this fast-paced setting, we are looking for someone who can stay cool, is humble, and is willing to learn. You are adaptable, can skill up fast, and are fearless at trying new methods. After all, you're in the business of saving the planet!
Requirements
- A minimum of 5 years of industry experience.
- Hyper-curious!
- Exceptional at Remote Sensing Data, GIS, Data Science.
- Must have big data & data analytics experience
- Very good at documenting and speccing datasets
- Experience with AWS Cloud, Linux, Infra as Code & Docker (containers) is a must
- Coordinate with cross-functional teams (DevOPS, QA, Design etc.) on planning and execution
- Lead, mentor and manage deliverables of a team of talented and highly motivated team of developers
- Must have experience in building, managing, growing & hiring data teams. Has built large-scale datasets from scratch
- Manages work on the team's Clubhouse and follows up with the team; ~50% of the work needs to be keeping a pulse on the platform team
- Exceptional communication skills & ability to abstract away problems & build systems. Should be able to explain anything & everything to management
- Quality control - you'll be responsible for maintaining a high quality bar for everything your team ships. This includes documentation and data quality
- Experience of having led smaller teams would be a plus.
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community that you can use, contribute to, and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but then there's all the non-work + fun aspect, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by availing the paid leaves.
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a data scientist to join its growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers and even colleagues from other business functions. Come save the planet with us!
Your Role
Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.
Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, OpenStreetMap, demographic data, socio-economic data and topography to extract useful insights about the events happening on our planet.
Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.
Demonstrate: A familiarity with working in geospatial libraries such as GDAL/Rasterio for reading/writing of data, and use of QGIS in making visualizations. This will also extend to using advanced statistical techniques and applying concepts like regression, properties of distribution, and conducting other statistical tests.
Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.
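A small illustration of the kind of geospatial computation this role involves: the great-circle (haversine) distance between two latitude/longitude points, in pure Python. The coordinates below are illustrative (approximately New Delhi and Mumbai); real pipelines would lean on GDAL/Rasterio and proper projections.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points in kilometres.
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Approximate distance from New Delhi to Mumbai.
d = haversine_km(28.6139, 77.2090, 19.0760, 72.8777)
print(round(d))  # roughly 1150 km
```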
Requirements
These are must have skill-sets that we are looking for:
- Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
- Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
- Worked on GIS and is familiar with geospatial libraries such as GDAL and rasterio to read/write the data, a GIS software such as QGIS for visualisation and query, and basic machine learning algorithms to make predictions.
- Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Capable of writing clear and lucid reports and demystifying data for the rest of us.
- Be curious and care about the planet!
- Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
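A minimal sketch of the statistical-testing requirement above: Welch's t-statistic for comparing two sample means, computed with only the standard library (the samples are made up for illustration; in practice scipy.stats.ttest_ind would give the statistic and p-value directly):

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    # Welch's t-statistic: difference of means divided by the
    # combined standard error (unequal variances allowed).
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / ((va / na + vb / nb) ** 0.5)

# Illustrative samples with clearly different means -> large |t|.
t = welch_t([10.1, 9.8, 10.3, 10.0], [8.2, 8.0, 8.4, 8.1])
print(t > 5)  # → True
```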
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community where you can use, contribute and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes there's work-but then there's all the non-work+fun aspect aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by availing the paid leaves.
● You have a good understanding of the fundamentals of data science/algorithms or software engineering
● Preferably you should have done some project or internship related to the field
● Knowledge of SQL is a plus
● A deep desire to learn new things and be a part of a vibrant start-up. You will have a lot of free hand and this comes with immense responsibility - so it is expected that you will be willing to master new things that come along!
What you will get to do?
● Build cloud-based services and/or user interfaces
● Participating in all aspects of software development activities, including design, coding, code review, unit testing, bug fixing, and code/API documentation
● Be the first few members of a growing technology team
● Understand, apply and extend state-of-the-art NLP research to better serve our customers.
● Work closely with engineering, product, and customers to scientifically frame the business problems and come up with the underlying AI models.
● Design, implement, test, deploy, and maintain innovative data and machine learning solutions to accelerate our business.
● Think creatively to identify new opportunities and contribute to high-quality publications or patents.
Desired Qualifications and Experience
● At least 1 year of professional experience.
● Bachelor's in Computer Science or related fields from top colleges.
● Extensive knowledge and practical experience in one or more of the following areas: machine learning, deep learning, NLP, recommendation systems, information retrieval.
● Experience applying ML to solve complex business problems from scratch.
● Experience with Python and a deep learning framework like PyTorch/TensorFlow.
● Awareness of the state of the art research in the NLP community.
● Excellent verbal and written communication and presentation skills.
- 6+ months of proven experience as a Data Scientist or Data Analyst
- Understanding of machine learning and operations research
- Extensive knowledge of R, SQL and Excel
- Analytical mind and business acumen
- Strong Statistical understanding
- Problem-solving aptitude
- BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
AI-powered cloud-based SaaS solution provider
- Manage end-to-end recruitment activities by sourcing the best talent from diverse sources and handle the entire talent-management lifecycle, right from joining to exit
- Strong network within product companies/startups
- Work with Marketing Team and execute ways to strengthen our brand awareness
- Review and benchmark the internal and external environment to improve HR/Talent Management
- Provide excellent candidate experience at each stage of the hiring process
- Provide Analytical and well-documented reports to the hiring managers.
- 3+ years of experience in Product/Start-Up companies is a must
- Ability to network and strong interpersonal skills
- Have hired for positions like Data Scientist, Engineering Manager, SDET, and DevOps.
- Good track record of conversions
- Excellent Written Communication!
- An open-minded and positive attitude
- Adaptable and able to fit into the work environment of a fast-growing start-up
- Strong sense of ethical responsibility
- Skilled at spreading a good vibe
- Notice period should be less than one month