Work shift: Day time
- Strong problem-solving skills with an emphasis on product development and deriving insights from large data sets.
• Experience building ML pipelines with Apache Spark and Python
• Proficiency in implementing the end-to-end data science lifecycle
• Experience with model fine-tuning and advanced grid-search techniques
• Experience working with and creating data architectures.
• Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
• Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests and proper usage, etc.) and experience with applications.
• Excellent written and verbal communication skills for coordinating across teams.
• A drive to learn and master new technologies and techniques.
• Assess the effectiveness and accuracy of new data sources and data gathering techniques.
• Develop custom data models and algorithms to apply to data sets.
• Use predictive modeling to optimize customer experience, revenue generation, ad targeting, and other business outcomes.
• Develop company A/B testing framework and test model quality.
• Coordinate with different functional teams to implement models and monitor outcomes.
• Develop processes and tools to monitor and analyze model performance and data accuracy.
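The A/B-testing responsibility above can be sketched with a minimal significance check. This is a hedged illustration, not the company's actual framework; the conversion counts are hypothetical, and a two-sided two-proportion z-test is one common choice for comparing variant conversion rates.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 260/5000 vs. A's 200/5000
z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.3f}, p = {p:.4f}")
```

At p below 0.05 the difference would typically be treated as significant; a real framework would also handle sample-size planning and multiple comparisons.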
● Strong knowledge of data science pipelines in Python
● Object-oriented programming
● A/B testing frameworks and model fine-tuning
● Proficiency with the scikit-learn, NumPy, and pandas packages in Python
Nice to have:
● Ability to work with containerized solutions: Docker/Compose/Swarm/Kubernetes
● Unit testing, Test-driven development practice
● DevOps and continuous integration/continuous deployment (CI/CD) experience
● Agile development environment experience, familiarity with SCRUM
● Deep learning knowledge
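The grid-search and fine-tuning skills listed above can be sketched without any ML library. This is a minimal illustration: the exhaustive search itself is the technique, while the quadratic "validation loss" is a hypothetical stand-in for a real cross-validation score.

```python
from itertools import product

def grid_search(param_grid, score_fn):
    """Exhaustively score every parameter combination; return the best (lowest)."""
    best_params, best_score = None, float("inf")
    keys = list(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(**params)          # lower is better here
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy stand-in for a validation loss, minimized at lr=0.1, depth=4
loss = lambda lr, depth: (lr - 0.1) ** 2 + (depth - 4) ** 2

grid = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4, 8]}
best, score = grid_search(grid, loss)
print(best, score)   # {'lr': 0.1, 'depth': 4} 0.0
```

In practice the same loop is what `sklearn.model_selection.GridSearchCV` performs, with cross-validation supplying the score.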
The ideal candidate will use their passion for big data and analytics to provide insights into the business covering a range of topics. They will be responsible for conducting both recurring and ad hoc analyses for business users.
Roles and Responsibilities
- Perform data analysis on large volumes of data to identify trends and/or data processing rules.
- Work as a team player within the core analytics team.
- Responsible for weekly and monthly Sales/Marketing reports on a gross and net basis, plus other ad hoc reports.
- Analyze data to generate insights on what drives better conversions, student preferences, the role of various investments and channels, spend optimization, etc.
- Prepare reports and dashboards for various business functions to keep track of important business metrics.
- Make and execute business decisions informed by this data and analysis.
Desired Candidate Profile
- Candidate with 1 to 5 years of work experience
- Good hands-on experience with advanced Excel and SQL.
- Has extensively worked on live Dashboards, reporting, data manipulation and making flat tables in SQL.
- Knowledge of Python/R.
- Strong analytical skills and ability to interpret data.
- Natural curiosity and self-drive to understand the broader business in order to provide the appropriate reporting support.
- Extremely high ownership; a self-starter comfortable working in a constantly changing, fast-growing environment.
- Establish collaborative and trusting relationships with the business's key internal leaders and stakeholders in order to ensure that there is a free flow of ideas and information across the business.
- First-principles thinking and strong problem-solving skills
- Good experience with Power BI visualizations and DAX queries in Power BI
- Experience implementing row-level security
- Can understand data models and implement simple-to-medium-complexity data models
- Quick learner, able to pick up the application's data design and processes
- Expert in SQL; able to analyze the current ETL/SSIS process
- Hands on experience in data modeling
- Data warehouse development and work with SSIS & SSAS (Good to have)
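The SQL reporting and flat-table skills above can be sketched with Python's built-in sqlite3 module. This is only an illustration; the table and column names (`sales`, `gross`, `refunds`) are hypothetical, not taken from any actual schema in the posting.

```python
import sqlite3

# In-memory database; schema and sample rows are illustrative only
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (week TEXT, channel TEXT, gross REAL, refunds REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [("2024-W01", "online", 1000.0, 50.0),
     ("2024-W01", "retail",  800.0, 20.0),
     ("2024-W02", "online", 1200.0, 80.0)],
)

# Flatten transactions into a weekly gross/net report
rows = conn.execute("""
    SELECT week,
           SUM(gross)           AS gross,
           SUM(gross - refunds) AS net
    FROM sales
    GROUP BY week
    ORDER BY week
""").fetchall()
for week, gross, net in rows:
    print(week, gross, net)
```

The same GROUP BY pattern is what feeds a live dashboard or a weekly gross/net report, regardless of the BI tool on top.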
Essential Duties and Responsibilities:
- Build data systems and pipelines
- Prepare data for ML modeling
- Combine raw information from different sources
- Conduct complex data analysis and report on results
Work Experience :
- 3 years of experience working with Node.js, AI/ML, and data transformation tools
- Hands on experience with ETL & Data Visualization tools
- Familiarity with Python (Numpy, Pandas)
- Experience with SQL & NoSQL DBs
Must have: Python, data warehouse tools, ETL, SQL/MongoDB, data modeling, data transformation, data visualization
Nice to have: MongoDB/ SQL, Snowflake, Matillion, Node.JS, ML model building
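The "combine raw information from different sources" duty above is, at its core, a small extract-transform-load step. The sketch below uses hypothetical record shapes (a CRM feed and an orders feed) purely for illustration.

```python
# Two "raw" sources with different shapes (hypothetical records)
crm = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]
orders = [{"user_id": 1, "amount": 250}, {"user_id": 1, "amount": 100},
          {"user_id": 2, "amount": 400}]

def transform(crm_rows, order_rows):
    """Join users to their orders and aggregate total spend per user."""
    totals = {}
    for o in order_rows:
        totals[o["user_id"]] = totals.get(o["user_id"], 0) + o["amount"]
    # Emit one flat row per user, ready to load into a warehouse table
    return [{"id": u["id"], "name": u["name"], "total_spend": totals.get(u["id"], 0)}
            for u in crm_rows]

flat = transform(crm, orders)
print(flat)
```

A production pipeline would do the same join/aggregate with pandas or SQL, but the shape of the work, extract, reconcile keys, aggregate, load, is identical.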
- B.E. in Computer Science or equivalent.
- In-depth knowledge of machine learning algorithms and their applications including
practical experience with and theoretical understanding of algorithms for classification,
regression and clustering.
- Hands-on experience in computer vision and deep learning projects solving real-world problems involving vision tasks such as object detection, object tracking, instance segmentation, activity detection, depth estimation, optical flow, multi-view geometry, domain adaptation, etc.
- Strong understanding of modern and traditional Computer Vision Algorithms.
- Experience with one or more deep learning frameworks or networks: PyTorch, TensorFlow, Darknet (YOLOv4/v5), U-Net, Mask R-CNN, EfficientDet, BERT, etc.
- Proficiency with CNN architectures such as ResNet, VGG, UNet, MobileNet, pix2pix,
and Cycle GAN.
- Experienced user of libraries such as OpenCV, scikit-learn, matplotlib and pandas.
- Ability to transform research articles into working solutions to solve real-world problems.
- High proficiency in Python programming.
- Familiar with software development practices/pipelines (DevOps: Kubernetes, Docker containers, CI/CD tools).
- Strong communication skills.
"Slack for Construction"
An early-stage startup cofounded by IIT Roorkee alumni: a mobile-based operating system to manage construction and architectural projects. Today, materials and all other project information are shared over WhatsApp; our mobile app manages all of this in one place, almost like a Slack tool for construction. It is a mobile app plus SaaS platform for administration and management of the process, with 150,000 users and subscription-based pricing. It helps construction project owners and contractors track on-site progress in real time to finish projects on time and on budget. We aim to bring the speed of software development to infrastructure development. Founded by IIT Roorkee alumni and backed by industry experts, we are on a mission to help the second-largest industry in India, construction, make the transition from pen and paper to digital.
About the team
As a productivity app startup, we value productivity and ownership most. That helps raise our own bar and the bar of the people we hire. We follow agile and scrum approaches for product development and use best-in-class tools and practices. Measuring our progress on a weekly basis and iterating fast enables us to build breakthrough modules and features rapidly. If you join us, you will be constantly thrown into challenging situations. The decisions you take will directly impact our clients and sales. That's how we learn.
- Prior experience in any data driven decision making field.
- Working knowledge of querying data using SQL.
- Familiarity with customer and business data analytics tools like Segment, Mixpanel, Google Analytics, SmartLook, etc.
- Data visualisation tools like Tableau, Power BI, etc.
"All things data"
- Ability to synthesize complex data into actionable goals.
- Critical thinking skills to recommend original and productive ideas
- Ability to visualise user stories and create user funnels
- Perform user test sessions and market surveys to inform product development teams
- Excellent writing skills to prepare detailed product specifications and analytic reports
- Help define Product strategy / Roadmaps with scalable architecture
- Interpersonal skills to work collaboratively with various stakeholders who may have competing interests
We are looking for a Machine Learning engineer for one of our premium clients.
Experience: 2-9 years
Python, PySpark, the Python Scientific Stack; MLFlow, Grafana, Prometheus for machine learning pipeline management and monitoring; SQL, Airflow, Databricks, our own open-source data pipelining framework called Kedro, Dask/RAPIDS; Django, GraphQL and ReactJS for horizontal product development; container technologies such as Docker and Kubernetes, CircleCI/Jenkins for CI/CD, cloud solutions such as AWS, GCP, and Azure as well as Terraform and Cloudformation for deployment
- Improve the robustness of Leena AI's current NLP stack
- Increase the zero-shot learning capability of Leena AI's current NLP stack
- Opportunity to add/build new NLP architectures based on requirements
- Manage the end-to-end lifecycle of data in the system until it achieves more than 90% accuracy
- Manage an NLP team
- Strong understanding of linear algebra, optimisation, probability, statistics
- Experience in the data science methodology from exploratory data analysis, feature engineering, model selection, deployment of the model at scale and model evaluation
- Experience in deploying NLP architectures in production
- Understanding of the latest NLP architectures, such as transformers, is good to have
- Experience in adversarial attacks/robustness of DNN is good to have
- Experience with a Python web framework (Django) and analytics and machine learning frameworks like TensorFlow/Keras/PyTorch.
- Adept at machine learning techniques and algorithms.
- Feature selection, dimensionality reduction, and building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Performing ad hoc analysis and presenting results
- Proficiency in query languages such as N1QL and SQL
- Experience with data visualization tools such as D3.js, ggplot, Plotly, PyPlot, etc.
- Creating automated anomaly detection systems and constantly tracking their performance
- Strong Python skills are a must.
- Strong data analysis and data mining skills are a must
- Deep learning, neural networks, CNNs, image processing (must)
- Building analytic systems: data collection and cleansing
- Experience with NoSQL databases such as Couchbase and MongoDB
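The automated anomaly detection mentioned above can be sketched with a simple trailing-window z-score rule. This is a hedged, minimal example built on the standard library; the window size, threshold, and data are all hypothetical, and production systems usually layer seasonality handling and alerting on top.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value is more than `threshold` sample standard
    deviations away from the mean of the preceding `window` points."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

data = [10, 11, 9, 10, 10, 10, 50, 10, 11]   # a spike at index 6
print(detect_anomalies(data))
```

Constant tracking of the detector's own performance, as the bullet asks, would mean logging flagged points and reviewing precision/recall against confirmed incidents.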
Someone who can make an impact by enabling innovation and growth; someone with a passion for what they do and a vision for the future.
- Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
- Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
- Become an expert on data and trends, both internal and external to Kaleidofin.
- Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
- Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
- Automate scheduling and distribution of reports and support auditing and value realization.
- Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
- Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks
- Experience leading development efforts through all phases of SDLC.
- 5+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
- Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
- Hands on experience in SQL, data management, and scripting (preferably Python).
- Strong data visualisation design skills, data modeling and inference skills.
- Hands-on experience in managing small teams.
- Financial services experience preferred, but not mandatory.
- Strong knowledge of architectural principles, tools, frameworks, and best practices.
- Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
- Team-handling experience preferred for candidates with 5+ years of experience.
- Notice period of less than 30 days.