We are looking for a Machine Learning engineer for one of our premium clients.
Experience: 2-9 years
Location: Gurgaon/Bangalore
Tech Stack:
- Python, PySpark, the Python scientific stack
- MLflow, Grafana, Prometheus for machine learning pipeline management and monitoring
- SQL, Airflow, Databricks, our own open-source data pipelining framework called Kedro, Dask/RAPIDS
- Django, GraphQL and ReactJS for horizontal product development
- Container technologies such as Docker and Kubernetes, CircleCI/Jenkins for CI/CD, cloud solutions such as AWS, GCP, and Azure, as well as Terraform and CloudFormation for deployment
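For a flavour of how this stack is typically wired together, here is a minimal, illustrative MLflow tracking sketch; the experiment name, parameters, and model below are hypothetical, not the client's actual pipeline:

```python
# Minimal MLflow experiment-tracking sketch (illustrative only; the
# experiment name, hyperparameters, and metric are hypothetical).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-pipeline")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)    # record hyperparameters
    mlflow.log_metric("accuracy", acc)       # record evaluation metrics
    mlflow.sklearn.log_model(model, "model") # persist the trained model
```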
WHO WE ARE:
TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane and a who’s who of the financial services industry. We create engaging wealth experiences to better financial lives through AI and investment-intelligence-powered personalization. We are working to change the world of wealth in the ways that personalization has changed the world of movies, music and more, but with the added responsibility of delivering better wealth outcomes.
We use design and behavioural thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.
In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.
OUR VALUES:
- Shared Understanding through Listening and Speaking the Truth. We communicate with radical candor, precision and compassion to create a shared understanding. We challenge, but once a decision is made, commit fully. We listen attentively, speak candidly.
- Teamwork for Teamwin. We believe in winning together and learning together. We fly in formation. We cover each other’s backs. We inspire each other with our energy and attitude.
- Make Magic for our Users. We center around the voice of the customer. With deep empathy for our clients, we create technology that transforms investor experiences.
- Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. We strive to be the best we can possibly be. No excuses.
- Innovate with Creative Solutions. We believe that disruptive innovation begins with curiosity and creativity. We challenge the status quo and problem solve to find new answers.
WHAT YOU'LL BE DOING:
We are looking for an experienced quantitative professional to develop, implement, test, and maintain the core algorithms and R&D framework for our investment and investment advisory platform. The ideal candidate for this role has successfully implemented and maintained quantitative and statistical modules using modular software design constructs. The candidate needs to be a responsible product owner, a problem solver and a team player looking to make a significant impact on a fast-growing company. The successful candidate will report directly to the Head of Quant Research & Development.
Responsibilities:
- End-to-end research, development, and maintenance of the investment platform, its data, and its algorithms
- Take part in building out the R&D backtesting and simulation engines (see the sketch after this list)
- Thoroughly vet investment algorithmic results
- Contribute to the research data platform design
- Investigate datasets for use in new or existing algorithms
- Participate in agile development practices
- Liaise with stakeholders to gather & understand the functional requirements
- Take part in code reviews, ensuring quality meets the highest standards
- Develop software using high quality standards and best practices, conduct thorough end-to-end unit testing, and provide support during testing and post go-live
- Support research innovation through creative and aggressive experimentation with cutting-edge hardware, software, processes, procedures, and methods
- Collaborate with technology teams to ensure appropriate requirements, standards, and integration
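As an illustration of the backtesting work referenced in this list, here is a minimal vectorized backtest sketch in pandas/NumPy; the moving-average crossover signal and all names are hypothetical assumptions, not TIFIN's methodology:

```python
# Minimal vectorized backtest sketch (illustrative; the signal and the
# synthetic price series are hypothetical assumptions).
import numpy as np
import pandas as pd

def backtest_sma_crossover(prices: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Return the cumulative return of a simple moving-average crossover strategy."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Long when the fast MA is above the slow MA; shift by one day to avoid lookahead.
    position = (fast_ma > slow_ma).astype(float).shift(1).fillna(0.0)
    daily_returns = prices.pct_change().fillna(0.0)
    strategy_returns = position * daily_returns
    return (1.0 + strategy_returns).cumprod()

# Usage with synthetic prices:
rng = np.random.default_rng(42)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 500))))
print(backtest_sma_crossover(prices).iloc[-1])  # final cumulative return
```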
Qualifications / Skillsets:
- Experience in a quant research & development role
- Proficient in Python, Git and Jira
- Knowledge in SQL and database development (PostgreSQL is a plus)
- Understanding of R and RMarkdown is a plus
- Bachelor’s degree in computer science, computational mathematics, or financial engineering
- Master’s degree or advanced training is a strong plus
- Excellent mathematical foundation and hands-on experience working in the finance industry
- Proficient in quantitative, statistical, and ML/AI techniques and their implementation using Python modules such as pandas, NumPy, SciPy, scikit-learn, etc. (see the example after this list)
- Strong communication (written and oral) and analytical problem-solving skills
- Strong sense of attention to detail, pride in delivering high quality work and willingness to learn
- An understanding of or exposure to financial capital markets, various financial instruments (such as stocks, ETFs, Mutual Funds, etc.), and financial tools (such as Bloomberg, Reuters, etc.)
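As a small, purely illustrative example of the kind of statistical implementation the qualifications above describe, here is a toy linear factor regression with scikit-learn on synthetic data:

```python
# Toy statistical implementation with the named stack (illustrative only;
# the factor data below is synthetic, not real market data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
factors = rng.normal(size=(250, 3))                 # daily factor returns
betas = np.array([0.8, -0.2, 0.5])                  # "true" exposures
asset = factors @ betas + rng.normal(0, 0.01, 250)  # synthetic asset returns

model = LinearRegression().fit(factors, asset)
print(model.coef_)                   # estimated exposures, close to betas
print(model.score(factors, asset))   # in-sample R^2
```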
COMPENSATION AND BENEFITS PACKAGE:
Compensation: Competitive and commensurate to experience + discretionary annual bonus.
TIFIN offers a competitive benefits package that includes:
- Performance linked variable compensation
- Medical insurance
- Remote work flexibility and other company benefits
TIFIN is proud to be an equal opportunity workplace and values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without discrimination.
Responsibilities
- Deliver full-cycle Tableau development projects, from business needs assessment and data discovery, through solution design, to delivery to the client.
- Enable our clients and ourselves to answer questions and develop data-driven insights through Tableau.
- Provide technical leadership and support across all aspects of Tableau development and use, from data specification development, through DataMart development, to supporting end-user dashboards and reports.
- Administer Tableau Server: create sites, add/remove users, and provide the appropriate level of access for users.
- Strategize and ideate the solution design. Develop UI mock-ups, storyboards, flow diagrams, conceptual diagrams, wireframes, visual mockups, and interactive prototypes.
- Develop best-practice guidelines for Tableau data processing and visualization, and use them to quickly deliver functionality across the client base and internal users.
Qualifications
- Degree in a highly relevant analytical or technical field, such as statistics, data science, or business analytics.
- 5+ years as a Tableau developer and administrator.
- Extensive experience with large data sets, statistical analyses, and visualization, as well as hands-on experience with tools (SQL, Tableau, Power BI).
- Ability to quickly learn and take responsibility for delivery.
Responsibilities:
- Act as a technical resource for the Data Science team and be involved in creating and implementing current and future analytics projects such as data lake design, data warehouse design, etc.
- Analyse and design ETL solutions to store/fetch data from multiple systems such as Google Analytics, CleverTap, CRM systems, etc.
- Develop and maintain data pipelines for real-time analytics as well as batch analytics use cases (see the sketch after this list)
- Collaborate with data scientists and actively work in the feature engineering and data preparation phases of model building
- Collaborate with product development and DevOps teams in implementing data collection and aggregation solutions
- Ensure quality and consistency of the data in the data warehouse and follow data governance best practices
- Analyse large amounts of information to discover trends and patterns
- Mine and analyse data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies.
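As a sketch of the batch side of such pipelines, here is a minimal, illustrative PySpark job; the paths, column names, and aggregation are hypothetical assumptions:

```python
# Minimal PySpark batch-ETL sketch (illustrative; the bucket paths,
# columns, and aggregation are hypothetical assumptions).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")  # raw clickstream
daily = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "user_id")
    .agg(F.count("*").alias("event_count"))
)
# Write partitioned output for downstream analytics / warehouse loads.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_user_events/"
)
```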
Requirements
- Bachelor’s or Master’s degree in a highly numerate discipline such as Engineering, Science, or Economics
- 2-6 years of proven experience working as a Data Engineer, preferably in an e-commerce/web-based or consumer technology company
- Hands-on experience with big data tools like Hadoop, Spark, Flink, Kafka, and so on
- Good understanding of the AWS ecosystem for big data analytics
- Hands-on experience in creating data pipelines, either using tools or by independently writing scripts
- Hands-on experience in scripting languages like Python, Scala, Unix shell scripting, and so on
- Strong problem-solving skills with an emphasis on product development
- Experience using business intelligence tools, e.g. Tableau, Power BI, would be an added advantage (not mandatory)
Who Are We
A research-oriented company with expertise in computer vision and artificial intelligence at its core, Orbo offers a comprehensive platform of AI-based visual enhancement tools, so companies can find a product suited to their needs, with deep-learning-powered technology that automatically improves their imagery.
ORBO's solutions help the BFSI, beauty and personal care, and e-commerce industries with digital transformation and image retouching in multiple ways.
WHY US
- Join a top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation, and deployment
- Tune models to achieve high accuracy and minimal latency
- Deploy developed computer vision models on edge devices, after optimization, to meet customer requirements
Requirements:
- Bachelor’s degree
- Deep and broad understanding of computer vision and deep learning algorithms.
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in image enhancement, object detection, image segmentation, and image classification algorithms
- Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT (see the sketch after this list)
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks like TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git
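As an illustration of the export-and-deploy step listed above, here is a minimal PyTorch-to-ONNX export followed by ONNX Runtime inference; the model choice and input shape are hypothetical, not Orbo's actual pipeline:

```python
# Minimal export-and-serve sketch for edge-style deployment (illustrative;
# the model and input shape are hypothetical assumptions).
import numpy as np
import onnxruntime as ort
import torch
import torchvision.models as models

model = models.mobilenet_v2(weights=None).eval()  # small CNN suited to edge devices
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "mobilenet_v2.onnx",
                  input_names=["input"], output_names=["logits"])

# Run optimized inference with ONNX Runtime.
session = ort.InferenceSession("mobilenet_v2.onnx")
logits = session.run(["logits"],
                     {"input": np.random.rand(1, 3, 224, 224).astype(np.float32)})[0]
print(logits.shape)  # (1, 1000)
```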
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast-growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
Job description
Helical Insight, an open-source Business Intelligence tool from Helical IT Solutions Pvt. Ltd., based out of Hyderabad, is looking for freshers with strong knowledge of SQL. Helical Insight has more than 50 clients from various sectors and has been awarded the most promising company in the Business Intelligence space. We are looking for rockstar teammates to join our company.
Job Brief
We are looking for a Business Intelligence (BI) Developer to create and manage BI and analytics
solutions that turn data into knowledge.
In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. If you also have business acumen and problem-solving aptitude, we’d like to meet you. Excellent knowledge of SQL queries is required; basic knowledge of HTML, CSS, and JS is also required.
You would be working closely with customers across various domains to understand their data and business requirements, and to deliver the required analytics in the form of various reports, dashboards, etc. This is an excellent client-interfacing role with the opportunity to work across various sectors and geographies, as well as various kinds of databases including NoSQL, RDBMS, graph DBs, columnar DBs, etc.
Skill set and qualifications required
Responsibilities
Attend client calls to gather requirements and show progress
Translate business needs to technical specifications
Design, build and deploy BI solutions (e.g. reporting tools)
Maintain and support data analytics platforms
Conduct unit testing and troubleshooting
Evaluate and improve existing BI systems
Collaborate with teams to integrate systems
Develop and execute database queries and conduct analyses
Create visualizations and reports for requested projects
Develop and update technical documentation
Requirements
Excellent expertise in SQL queries
Proven experience as a BI Developer or Data Scientist
Background in data warehouse design (e.g. dimensional modeling) and data mining
In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (extract, transform, load) framework
Familiarity with BI technologies
Proven abilities to take initiative and be innovative
Analytical mind with a problem-solving aptitude
BE in Computer Science/IT
Education: BE/BTech/MCA/BCA/MTech/MS, or equivalent preferred.
Interested candidates can call us on +91 7569 765 162.
Job Title: Analyst / Sr. Analyst – Data Science Developer – Python
Experience: 2 to 5 years
Location: Bangalore / Hyderabad / Chennai
Notice period: candidates should be able to join within 2 months (max); immediate joiners preferred.
About the role:
We are looking for an Analyst / Senior Analyst working in the analytics domain with a strong Python background.
Desired Skills, Competencies & Experience:
• 2-4 years of experience working in the analytics domain with a strong Python background.
• Visualization skills in Python with Plotly, Matplotlib, Seaborn, etc., and the ability to create customized plots using such tools (see the example below).
• Ability to write effective, scalable, and modular code; should be able to quickly understand, test, and debug existing Python project modules and contribute to them.
• Familiarity with Git workflows.
Good to have:
• Familiarity with cloud platforms like AWS, Azure ML, Databricks, GCP, etc.
• Understanding of shell scripting and Python package development.
• Experience with Python data science packages like pandas, NumPy, scikit-learn, etc.
• Experience building and evaluating ML models using scikit-learn.
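As a small, purely illustrative example of the customized plotting described above, here is a Matplotlib sketch on synthetic data:

```python
# Customized Matplotlib plot sketch (illustrative; the data is synthetic).
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(100)
y = np.cumsum(rng.normal(0, 1, 100))

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(x, y, color="tab:blue", linewidth=1.5, label="metric")
ax.axhline(y.mean(), color="tab:red", linestyle="--", label="mean")
ax.set_title("Customized plot example")
ax.set_xlabel("day")
ax.set_ylabel("value")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("example_plot.png", dpi=150)  # export for a report/dashboard
```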
Aikon Labs Pvt Ltd is a start-up focused on realizing ideas. One such idea is iEngage.io, our Intelligent Engagement Platform. We leverage Augmented Intelligence, a combination of machine-driven insights & human understanding, to serve a timely response to every interaction from the people you care about.
Get in touch if you are interested.
Do you have a passion to be a part of an innovative startup? Here’s an opportunity for you - become an active member of our core platform development team.
Main Duties
● Quickly research the latest innovations in Machine Learning, especially with respect to Natural Language Understanding, & implement them if useful
● Train models to provide different insights, mainly from text but also from other media such as audio and video
● Validate the trained models; fine-tune & optimise as necessary
● Deploy validated models, wrapped in a Flask server as a REST API or containerized in Docker (see the sketch after this list)
● Build preprocessing pipelines for the models that are being served as a REST API
● Periodically test & validate models in use; update where necessary
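As a minimal sketch of the Flask-wrapped model endpoint described in the duties above, assuming a hypothetical scikit-learn model serialized with joblib (the route and payload schema are illustrative assumptions, not Aikon's API):

```python
# Minimal Flask REST API for model serving (illustrative sketch; the
# endpoint, payload schema, and model file are hypothetical assumptions).
from flask import Flask, jsonify, request
import joblib

app = Flask(__name__)
model = joblib.load("model.joblib")  # a previously trained sklearn model (hypothetical)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()  # e.g. {"features": [[...], [...]]}
    preds = model.predict(payload["features"]).tolist()
    return jsonify({"predictions": preds})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

In production such an app would typically sit behind a WSGI server (e.g. gunicorn) inside the Docker container mentioned above.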
Role & Relationships
We consider ourselves a team & you will be a valuable part of it. You could be reporting to a senior member or directly to our Founder & CEO.
Educational Qualifications
We don’t discriminate, as long as you have the required skill set & the right attitude.
Experience
Up to two years of experience, preferably working on ML. Freshers are welcome too!
Skills
Good
● Strong understanding of Java / Python
● Clarity on concepts of Data Science
● A strong grounding in core Machine Learning
● Ability to wrangle & manipulate data into a processable form
● Knowledge of web technologies like web servers (Flask, Django, etc.) and REST APIs
Even better
● Experience with deep learning
● Experience with frameworks like Scikit-Learn, Tensorflow, Pytorch, Keras
Competencies
● Knowledge of NLP libraries such as NLTK, spaCy, Gensim
● Knowledge of NLP models such as Word2vec, GloVe, ELMo, fastText
● An aptitude to solve problems & learn something new
● Highly self-motivated
● Analytical frame of mind
● Ability to work in a fast-paced, dynamic environment
Location
Pune
Remuneration
Once we meet, we shall make an offer depending on how good a fit you are & the experience you already have
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards, as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially of Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow (see the sketch below)
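As an illustration of the orchestration piece, here is a minimal, illustrative Airflow DAG; the DAG id, schedule, and task body are hypothetical assumptions:

```python
# Minimal Airflow (2.4+) DAG sketch (illustrative; the DAG id, schedule,
# and task body are hypothetical assumptions).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Placeholder for a real extract/transform/load step.
    print("running daily ETL step")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl = PythonOperator(task_id="extract_and_load",
                         python_callable=extract_and_load)
```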