Data Analyst
Posted by Keerthana k
1 - 2 yrs
₹7L - ₹12L / yr
Bengaluru (Bangalore)
Skills
Data Science
Data Analytics
SQL
Python
Data warehousing
Big Data
Geospatial analysis
Skill Set 
SQL, Python, NumPy, Pandas. Knowledge of Hive and data warehousing concepts is a plus.

JD 

- Strong analytical skills with the ability to collect, organise, analyse and interpret trends or patterns in complex data sets and provide reports & visualisations.

- Work with management to prioritise business KPIs and information needs; locate and define new process improvement opportunities.

- Technical expertise with data models, database design and development, data mining and segmentation techniques

- Proven success in a collaborative, team-oriented environment

- Working experience with geospatial data will be a plus.
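Since working experience with geospatial data is called out as a plus, a minimal illustration of the kind of computation involved is the haversine great-circle distance between two coordinates. The ride endpoints below are hypothetical, not from the posting:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical ride start and end points in Bengaluru
trip_km = haversine_km(12.9716, 77.5946, 12.9352, 77.6245)
```

The same function, applied over a column of trip records, is a typical first step in trip-length or coverage analysis.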

About Yulu Bikes

Founded: 2017
Size: 20-100
Stage: Raised funding
Yulu is the leading shared micro-mobility service provider using electric two-wheelers to reduce traffic congestion and air pollution in urban India. Yulu's mission is to create sustainable cities of tomorrow by revolutionizing the way people commute. Yulu’s technology-driven mobility platform uses IoT, ML, and AI for demand-supply management and efficient operations. Currently, Yulu is present in Bengaluru, New Delhi, Pune, Greater Mumbai, and Bhubaneswar. Yulu is committed to building the largest EV ecosystem of India to provide smart, shared and sustainable mobility solutions for the urban commute.

Similar jobs

DeepIntent
Posted by Indrajeet Deshmukh
Pune
4 - 8 yrs
Best in industry
Data Warehouse (DWH)
Informatica
ETL
SQL
Google Cloud Platform (GCP)
+3 more

Who We Are:

DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.

What You’ll Do:

We are looking for a Senior Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.  

This role will be in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and constantly seeks to improve the facets of the business they manage. They should also demonstrate the ability to collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams
  • Develop and standardize all interface points for analysts to retrieve and analyze data, with a focus on research methodologies and data-based decisioning
  • Optimize queries and data-access efficiency; serve as the expert on how to most efficiently attain desired data points
  • Build "mastered" versions of the data for Analytics-specific querying use cases
  • Help with data ETL and table performance optimization
  • Establish a formal data practice for the Analytics organization in conjunction with the rest of DeepIntent
  • Build and operate scalable and robust data architectures
  • Interpret analytics methodology requirements and apply them to the data architecture to create standardized queries and operations for use by analytics teams
  • Implement DataOps practices
  • Master existing and new data pipelines and develop appropriate queries to meet analytics-specific objectives
  • Collaborate with business stakeholders, software engineers, machine learning engineers and analysts
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation

Who You Are:

  • Adept in market research methodologies and using data to deliver representative insights
  • Inquisitive and curious; understands how to query complicated data sets and move and combine data between databases
  • Deep SQL experience is a must
  • Exceptional communication skills, with the ability to collaborate and translate between technical and non-technical needs
  • English language fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable data pipelines serving high-volume, high-velocity data
  • Experience working with public clouds like GCP/AWS
  • Good understanding of software engineering, DataOps, data architecture, and Agile and DevOps methodologies
  • Experience building data architectures that optimize performance and cost, whether the components are prepackaged or homegrown
  • Proficient with SQL, Python or a JVM-based language, and Bash
  • Experience with Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious
  • Comfortable working in the EST time zone
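The "deep SQL experience" and query-efficiency points above can be sketched with a toy example: an index lets the engine satisfy a selective filter without scanning the whole table. The table and column names are invented for illustration; the same idea applies to any analytical store:

```python
import sqlite3

# Toy analytics table: campaign impressions with a cost per row
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (campaign_id INTEGER, cost REAL)")
conn.executemany(
    "INSERT INTO impressions VALUES (?, ?)",
    [(i % 10, i * 0.01) for i in range(1000)],
)

# Index on the filter column so selective queries avoid a full scan
conn.execute("CREATE INDEX idx_campaign ON impressions (campaign_id)")

row = conn.execute(
    "SELECT COUNT(*), ROUND(SUM(cost), 2) FROM impressions WHERE campaign_id = 3"
).fetchone()
```

A standardized, indexed query layer like this is what lets analysts retrieve data efficiently without each person re-deriving the access path.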


Intellikart Ventures LLP
Posted by Prajwal Shinde
Hyderabad
6 - 11 yrs
₹7L - ₹15L / yr
Python
SQL
Relational Database (RDBMS)
NOSQL Databases
SQL Azure
+3 more

· Honest, transparent team player and go-getter.
· Friendly, positive work culture based on respect for the individual.
· Work on large-scale projects with onshore and offshore models.
· At least 6+ years of healthcare experience.
· Knowledge of healthcare data domains and supporting data initiatives such as data governance, data modelling, and building data-driven use cases.
· Knowledge of source-to-target mapping.
· Works well with optimizers and helps with performance tuning of SQL Server and Azure objects.
· Can-do attitude; has used cutting-edge technologies in real projects.
· Support and perform daily tasks around SQL Server and ETL.
· Ability to integrate with Azure and ADF.
· Process-focused, with good knowledge of error/exception handling.
· Ability to code with modification history and to unit test thoroughly before passing work on.
· Ability to review code; trustworthy; a leader in finishing ETL processes from start to end without errors.
· Work in an environment with pair programming and Agile.

 

Key responsibilities:

 

· Develop and maintain databases in the Azure environment.
· Build ETL processes with Azure Data Factory, Azure Databricks, and data queries.
· Perform migration processes to or from the Azure environment.
· Azure data engineering: Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse.
· Develop using Python/Scala and SQL Server components.
· Work with SQL Server daily and solve problems as they arise.
· Work with RDBMS and NoSQL.
· Work in a team following Agile practices.
· Support exploration of new market trends and sales to ensure growth, with interesting projects that leverage the latest technologies.
· Actively participate in building specialty-group skills together with others.
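As a rough sketch of the extract-transform-load pattern the responsibilities above describe (the real pipelines would run in Azure Data Factory or Databricks; the healthcare records here are invented for illustration):

```python
import csv
import io

# Extract: read raw CSV (stand-in for an ADF/Databricks source)
raw = io.StringIO("patient_id,visits\n1,3\n2,\n1,2\nbad,x\n")
rows = list(csv.DictReader(raw))

# Transform: drop malformed rows, cast types, aggregate per patient
clean = {}
for r in rows:
    try:
        pid, visits = int(r["patient_id"]), int(r["visits"])
    except (ValueError, TypeError):
        continue  # in a real pipeline, route bad records to an error-handling path
    clean[pid] = clean.get(pid, 0) + visits

# Load: 'clean' would be written to the target table
```

The explicit error-handling branch mirrors the "Error/Exception handling" requirement above: bad records are isolated rather than silently corrupting the load.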

Mobile Programming LLC
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snowflake schema
Snowflake
+5 more

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning.
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
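Point 5 above, data validation and cleansing before loading, can be sketched in plain Python. The field names and rules here are illustrative assumptions, not the company's actual schema:

```python
# Minimal validation pass before loading a batch into a warehouse table (sketch)
REQUIRED = {"order_id", "amount"}

def validate(record):
    """Return a cleansed record, or None if it fails basic checks."""
    if not REQUIRED <= record.keys():
        return None  # missing required field
    try:
        amount = float(record["amount"])
    except (TypeError, ValueError):
        return None  # non-numeric amount
    if amount < 0:
        return None  # business rule: no negative amounts
    return {"order_id": str(record["order_id"]).strip(), "amount": round(amount, 2)}

batch = [{"order_id": " A1 ", "amount": "10.5"}, {"order_id": "A2", "amount": "-3"}]
cleaned = [r for r in (validate(rec) for rec in batch) if r is not None]
```

In practice, rejected records would be logged or routed to a quarantine table so data-quality issues remain visible rather than silently dropped.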

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

GroundTruth
Posted by Priti Singh
Remote only
7 - 12 yrs
₹15L - ₹32L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+3 more

You will:

  • Create highly scalable AWS micro-services utilizing cutting edge cloud technologies.
  • Design and develop Big Data pipelines handling huge geospatial data.
  • Bring clarity to large complex technical challenges.
  • Collaborate with Engineering leadership to help drive technical strategy.
  • Scope, plan and estimate projects.
  • Mentor and coach team members at different levels of experience.
  • Participate in peer code reviews and technical meetings.
  • Cultivate a culture of engineering excellence.
  • Seek, implement and adhere to standards, frameworks and best practices in the industry.
  • Participate in on-call rotation.

You have:

  • Bachelor’s/Master’s degree in computer science, computer engineering or relevant field.
  • 5+ years of experience in software design, architecture and development.
  • 5+ years of experience using object-oriented languages (Java, Python).
  • Strong experience with Big Data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
  • Strong experience in working with different AWS technologies.
  • Excellent competencies in data structures & algorithms.

Nice to have:

  • Proven track record of delivering large scale projects, and an ability to break down large tasks into smaller deliverable chunks
  • Experience in developing high throughput low latency backend services
  • Affinity to spatial data structures and algorithms.
  • Familiarity with Postgres DB, Google Places or Mapbox APIs
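For the "affinity to spatial data structures" point above, one minimal example of such a structure is a uniform grid index: points are bucketed into cells so a nearby-point query only scans a handful of cells instead of the whole dataset. Cell size and points below are illustrative choices:

```python
from collections import defaultdict

CELL = 0.01  # grid cell size in degrees (roughly 1 km); an illustrative choice

def cell(lat, lon):
    # int() truncation is fine for positive coordinates; a production
    # index would use math.floor to handle negatives correctly
    return (int(lat / CELL), int(lon / CELL))

index = defaultdict(list)
points = [
    ("poi_a", 12.9716, 77.5946),
    ("poi_b", 12.9721, 77.5950),
    ("poi_c", 13.0500, 77.7000),
]
for name, lat, lon in points:
    index[cell(lat, lon)].append(name)

def nearby(lat, lon):
    """Candidates in the query point's cell and its 8 neighbours."""
    ci, cj = cell(lat, lon)
    out = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out.extend(index[(ci + di, cj + dj)])
    return out
```

Real systems use richer structures (geohashes, R-trees, PostGIS indexes), but the grid captures the core idea of trading memory for query locality.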

What we offer

At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.

  • Unlimited Paid Time Off
  • In Office Daily Catered Lunch
  • Fully stocked snacks/beverages
  • 401(k) employer match
  • Health coverage including medical, dental, vision and option for HSA or FSA
  • Generous parental leave
  • Company-wide DEIB Committee
  • Inclusion Academy Seminars
  • Wellness/Gym Reimbursement
  • Pet Expense Reimbursement
  • Company-wide Volunteer Day
  • Education reimbursement program
  • Cell phone reimbursement
  • Equity Analysis to ensure fair pay
Disruptive Fintech Startup
Agency job
via Unnati by Sarika Tamhane
Bengaluru (Bangalore)
3 - 7 yrs
₹8L - ₹12L / yr
Data Science
Data Analytics
R Programming
Python
Investment analysis
+1 more
If you are interested in joining a purpose-driven community that is dedicated to creating ambitious and inclusive workplaces, then be a part of a high growth startup with a world-class team, building a revolutionary product!
 
Our client is a vertical fintech play focused on solving industry-specific financing gaps in the food sector through the application of data. The platform provides skin-in-the-game growth capital to much-loved F&B brands. Founded in 2019, they are VC-funded and based out of Singapore and Bangalore, India.
 
The founders are alumni of IIT-D, IIM-B and Wharton. They have 12+ years of experience across venture capital and corporate entrepreneurship at DFJ, Vertex and InMobi, as VP at Snyder UAE, in investment banking at Unitus Capital (leading the financial services practice), and in institutional equities at Kotak. They have a team of high-quality professionals coming together on a mission to disrupt convention.
 
 
As a Data Scientist, you will develop a first-of-its-kind risk engine for revenue-based financing in India, automating investment appraisals for the company's different revenue-based financing products.

What you will do:
 
  • Identifying alternate data sources beyond financial statements and implementing them as a part of assessment criteria
  • Automating appraisal mechanisms for all newly launched products and revisiting the same for an existing product
  • Back-testing investment appraisal models at regular intervals to improve the same
  • Complementing appraisals with portfolio data analysis and portfolio monitoring at regular intervals
  • Working closely with the business and the technology team to ensure the portfolio is performing as per internal benchmarks and that relevant checks are put in place at various stages of the investment lifecycle
  • Identifying relevant sub-sector criteria to score and rate investment opportunities internally

 


Candidate Profile:

What you need to have:
 
  • Bachelor’s degree with relevant work experience of at least 3 years with CA/MBA (mandatory)
  • Experience in working in lending/investing fintech (mandatory)
  • Strong Excel skills (mandatory)
  • Previous experience in credit rating or credit scoring or investment analysis (preferred)
  • Prior exposure to working on data-led models on payment gateways or accounting systems (preferred)
  • Proficiency in data analysis (preferred)
  • Good verbal and written skills
Pune
5 - 8 yrs
₹1L - ₹15L / yr
Informatica
Informatica PowerCenter
Spark
Hadoop
Big Data
+6 more

Technical/Core skills

  1. Minimum 3 yrs of experience with Informatica Big Data Developer (BDM) in a Hadoop environment.
  2. Knowledge of Informatica PowerExchange (PWX).
  3. Minimum 3 yrs of experience with big data querying tools like Hive and Impala.
  4. Ability to design and develop complex mappings using Informatica Big Data Developer.
  5. Create and manage Informatica PowerExchange and CDC real-time implementations.
  6. Strong Unix skills for writing shell scripts and troubleshooting existing scripts.
  7. Good knowledge of big data platforms and their frameworks.
  8. Good to have experience with Cloudera Data Platform (CDP).
  9. Experience building stream-processing systems using Kafka and Spark.
  10. Excellent SQL knowledge.

 

Soft skills :

  1. Ability to work independently
  2. Strong analytical and problem-solving skills
  3. Eagerness to learn new technologies
  4. Regular interaction with vendors, partners and stakeholders
Busigence Technologies
Posted by Seema Verma
Bengaluru (Bangalore)
0 - 10 yrs
₹2L - ₹9L / yr
Data Science
Machine Learning (ML)
Deep Learning
Analytics
Statistical Analysis
+3 more
APPLY LINK: https://bit.ly/2KagS3n

Go through the entire job post and find the role relevant to your background. Applicants must meet the "Real You" requirements for that role; otherwise the application will be rejected outright.

Busigence is a decision intelligence company. We create decision intelligence products for real people by combining data, technology, business, and behaviour, enabling strengthened decisions. We are a scaling, established startup by IIT alumni, innovating in and disrupting the marketing domain through artificial intelligence. We bring on board people dedicated to delivering wisdom to humanity by solving the world's most pressing problems differently, thereby significantly impacting thousands of souls every day.

We are a deep-rooted organization with a six-year success story, having worked with folks from top-tier backgrounds (IIT, NSIT, DCE, BITS, IIITs, NITs, IIMs, ISI, etc.) while maintaining an awesome culture with a common vision to build great data products. In the past we have served fifty-five customers and are presently developing our second product, Robonate. The first was emmoQ, an emotion intelligence platform. The third offering, H2HData, is an innovation lab where we solve hard problems through data, science, and design. We work extensively and intensely on big data, data science, machine learning, deep learning, reinforcement learning, data analytics, natural language processing, cognitive computing, and business intelligence.

First and foremost: before you dive in and press Apply, evaluate yourself. We are looking for the right candidate, not the best candidate. We love to work with someone who gels with our vision, beliefs, thoughts, methods, and values, which are aligned with what can be expected in a true startup with ambitious goals. Skills are always secondary to us. Primarily, you must be someone who is not essentially looking for a job or career, but starving for a challenge. In a nutshell, we need these three things in you:

1. You must be super sharp (Irodov, Mensa, Feynman, Polya, ACM, NIPS, ICAAC, BattleCode, DOTA, etc. should be familiar territory. Can you relate solution 1 to problem 2, or do you get confused even on problems similar to ones you have solved before? Can you grasp a problem statement in one go?)
2. You must be extremely energetic (Do you raise eyebrows when asked to stretch your limits, in complexity or in extra hours? What comes to mind first: "let's finish it today" or "this can be done tomorrow"? It's Friday 10 PM at work. Tired?)
3. You must be honourably honest (Do you tell others what you think, or what they want to hear? The latter is good for a sales team with its customers, not for this role. Are you honest with your work, and intrinsically with yourself first?)

You know yourself best; if not, ask your loved ones, then decide. We need exceedingly motivated people with entrepreneurial traits, not an employee mindset. This is an immediate requirement. We shall have an accelerated interview process for fast closure; you would be required to be proactive and responsive.

ROLE 1: Intern - Data Science & ML
Team: Sciences | Full-time, temporary | Bengaluru, India
Relevant exp: minus 1 - 10 years | Background: top-tier institute | Compensation: INR 20K - 60K/month

We are looking for students, graduates, and experienced folks with a real passion for algorithms, computing, and analysis. You would work with our sciences team on complex cases from data science, machine learning, and business analytics.

Mandatory:
R1. Must know functional programming in Python (https://docs.python.org/2/howto/functional.html) inside out, with a strong flair for data structures, linear algebra, and algorithm implementation. OOP alone will not be accepted.
R2. Must have hands-on experience with methods, functions, and workarounds in NumPy, Pandas, Scikit-learn, SciPy, and Statsmodels; collectively you should have implemented at least 75 different techniques (a figure averaged from past interns in this role).
R3. Must have implemented complex mathematical logic through a functional map-reduce framework in Python.
R4. Must understand the EDA cycle, machine learning algorithms, hyper-parameter optimization, ensemble learning, regularization, prediction, clustering, and associations, at an essential level.
R5. Must have solved at least five problems through data science and machine learning. Mere Coursera learning and/or offline Kaggle attempts will not be accepted.
Preferred:
R6. Good to have the calibre to grasp the underlying business of the problem to be solved.

ROLE 2: Intern - Deep Learning & AI
Team: Sciences | Full-time, temporary | Bengaluru, India
Relevant exp: minus 1 - 10 years | Background: top-tier institute | Compensation: INR 25K - 75K/month

We are looking for students, graduates, and experienced folks with a real passion for algorithms, neural networks, and unstructured data analysis. You would work with our sciences team on complex cases from language processing, audio analytics, and image classification.

Mandatory:
R1. Must know functional programming in Python inside out, with a strong flair for data structures, linear algebra, and algorithm implementation. OOP alone will not be accepted.
R2. Must have hands-on experience with methods, functions, and workarounds in TensorFlow; collectively you should have implemented at least 50 different techniques (a figure averaged from past interns in this role).
R3. Must have implemented complex mathematical logic through a functional map-reduce framework in Python.
R4. Must understand CNNs, RNNs, MLPs, and auto-encoders at an essential level.
R5. Must have solved at least three problems through deep learning. Mere Coursera learning and/or offline Kaggle attempts will not be accepted.
Preferred:
R6. Good to have worked on pre-processing techniques for images, audio, and text: OpenCV, Librosa, NLTK.
R7. Good to have used pre-trained models: VGGNet, Inception, ResNet, WaveNet, Word2Vec.
R8. Good to have the calibre to grasp the underlying business of the problem to be solved.

ROLE 3: Intern - Web Development
Team: Engineering | Full-time, temporary | Bengaluru, India
Relevant exp: minus 1 - 10 years | Background: top-tier institute | Compensation: INR 15K - 45K/month

We are looking for students, graduates, and experienced folks with a real passion for web development. You would work with our engagement team on website and web application development.

Mandatory:
R1. Must have working knowledge of the front end (CSS, HTML, JavaScript, jQuery) and back end (MySQL). Theoretical understanding and/or academic projects will not be accepted.
R2. Must have designed and developed at least three real-world web portals and applications.
Preferred:
R3. Good to have worked on WordPress with or without pre-built themes/templates.

Ideal YOU
Y1. Studying or graduated in engineering or any other data-heavy field, at Bachelor's level or above, from a top-tier institute.
Y2. Relevant experience of 0.25 - 3 years working on real-world problems in a reputed company or a proven startup.
Y3. A fanatical implementer who loves to spend time with code and workarounds, more than with your loved ones.
Y4. A true believer that human intelligence can be augmented through computer science and mathematics, and that your survival depends on getting the most from the data.
Y5. An entrepreneurial mindset with ownership, intellect, and creativity as your way of working. These are not fancy words; we mean it.

Actual WE
W1. A real startup with meaningful products.
W2. Revolutionary, not just disruptive.
W3. Rule creators, not followers.
W4. Small teams with real brains, not a herd of blockheads.
W5. We trust you completely and expect to be trusted back.

Why Us
In addition to the regular stuff every good startup offers (lots of learning, food, parties, an open culture, flexible working hours, and what not), we offer you work on our revolutionary products, pioneers in their respective categories. We try hard to hire fun-loving folks who are driven by more than a paycheck. You shall be working with the creamiest talent on extremely challenging problems at a most happening workplace.

How to Apply
Apply online by clicking "Apply Now". For queries regarding an open position, write to [email protected] For more information, visit http://www.busigence.com (Careers: http://careers.busigence.com | Research: http://research.busigence.com | Jobs: http://careers.busigence.com/jobs/data-science).

If you feel you fit the position, attach a PDF resume highlighting your: A. Key skills. B. Knowledge inputs. C. Major accomplishments. D. Problems solved. E. Submissions (GitHub / Stack Overflow / Kaggle / Project Euler, etc., if applicable).

If you don't see an open position that interests you, join our Talent Pool and let us know how you can make a difference here. Referrals are more than welcome. Keep us in the loop.
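R3 in the roles above asks for complex logic expressed through a functional map-reduce style in Python. A minimal, self-contained illustration using the standard library (the computation itself, a sum of squares of even numbers, is arbitrary):

```python
from functools import reduce

nums = range(10)

# filter -> map -> reduce: no mutable state, each stage a pure function
result = reduce(
    lambda acc, x: acc + x,                      # reduce: fold into a sum
    map(lambda x: x * x,                         # map: square each value
        filter(lambda x: x % 2 == 0, nums)),     # filter: keep evens
    0,
)
```

The same filter/map/reduce decomposition scales to distributed frameworks, where the map and reduce stages run across partitions of the data.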
FMCG sector
Agency job
via CIEL HR Services by Riya Tharakan
Remote only
9 - 16 yrs
₹20L - ₹30L / yr
Data Analytics
Advanced analytics
Data mining
Statistical Modeling
Analytics

Role: Data Analytics Lead / Manager

  1. Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.
  2. Build, develop and maintain data models, reporting systems, data automation systems, dashboards and performance metrics that support key business decisions.
  3. Coordinate with different teams to determine requirements for data warehousing, reporting and analytical solutions, and ensure customer satisfaction on deliverables
  4. Oversee analytics projects to extract, manage, and analyse data from multiple applications, ensuring that deadlines are met.
  5. Apply statistics and data modelling to gain actionable business insights and boost customer productivity and revenue.
  6. Enforce company policies and procedures to ensure quality and prevent discrepancies.
  7. Communicate and track key performance metrics across departments.
  8. Keep abreast of industry best practices and policies.
  9. Research Latest trends, analyse data, identify opportunities and incorporate changes into business strategies.
  10. Manage and optimize processes for data intake, validation, mining and engineering as well as modelling, visualization and communication deliverables.
  11. Examine, interpret and report results of analytical initiatives to stakeholders in leadership, technology, sales, marketing and product teams.
  12. Oversee the data/report requests process: tracking requests submitted, prioritization, approval, etc.
  13. Develop and implement MLDevOps, quality controls and departmental standards to ensure quality standards, organizational expectations, and regulatory requirements.
  14. Anticipate future demands of initiatives related to people, technology, budget and business within your department and design/implement solutions to meet these needs.
  15. Organize and drive successful completion of data insight initiatives through effective management of data analysts and effective collaboration with stakeholders.

Essential Skills

  1. Working knowledge of data mining principles: predictive analytics, mapping, collecting data from multiple data systems on premises and cloud-based data sources.
  2. Strong SQL skills, ability to perform effective querying involving multiple tables and subqueries.
  3. Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data, drawing conclusions, and developing actionable recommendations for business units.
  4. Experience and knowledge of statistical modelling techniques: multiple regression, logistic regression, log-linear regression, variable selection, etc.

We’re looking for someone with at least 10 years of experience monitoring, managing, transforming and drawing insights from data, and at least 5 years of experience leading a data analyst team.

Experience in Azure is preferred

Artivatic
Posted by Layak Singh
Bengaluru (Bangalore)
3 - 7 yrs
₹6L - ₹14L / yr
Python
Machine Learning (ML)
Artificial Intelligence (AI)
Deep Learning
Natural Language Processing (NLP)
+3 more
About Artivatic
Artivatic is a technology startup that uses AI/ML/deep learning to build intelligent products & solutions for finance, healthcare & insurance businesses. It is based out of Bangalore with a 25+ member team focused on technology. Artivatic is building cutting-edge solutions to enable 750 million-plus people to get insurance, financial access, and health benefits using alternative data sources, increasing their productivity, efficiency, automation power, and profitability, and helping them do business more intelligently & seamlessly.

Artivatic offers lending underwriting, credit/insurance underwriting, fraud prediction, personalization, recommendation, risk profiling, consumer profiling intelligence, KYC automation & compliance, healthcare, automated decisions, monitoring, claims processing, sentiment/psychology behaviour analysis, auto insurance claims, travel insurance, disease prediction for insurance, and more.

Job description
We at Artivatic are seeking a passionate, talented and research-focused natural language processing engineer with a strong machine learning and mathematics background to help build industry-leading technology. The ideal candidate will have research/implementation experience in modeling and developing NLP tools, and experience working with machine learning/deep learning algorithms.

Roles and responsibilities
- Developing novel algorithms and modeling techniques to advance the state of the art in Natural Language Processing.
- Developing NLP-based tools and solutions end to end.
- Working closely with R&D and machine learning engineers implementing algorithms that power user- and developer-facing products.
- Being responsible for measuring and optimizing the quality of your algorithms.

Requirements
- Hands-on experience building NLP models using NLP libraries and toolkits such as NLTK, Stanford NLP, etc.
- Good understanding of rule-based, statistical and probabilistic NLP techniques.
- Good knowledge of NLP approaches and concepts like topic modeling, text summarization, semantic modeling, named entity recognition, etc.
- Good understanding of machine learning and deep learning algorithms.
- Good knowledge of data structures and algorithms.
- Strong programming skills in Python/Java/Scala/C/C++.
- Strong problem-solving and logical skills.
- A go-getter attitude with the willingness to learn new technologies.
- Well versed in software design paradigms and good development practices.

Basic qualifications
- Bachelor's or Master's degree in Computer Science, Mathematics or a related field, with specialization in Natural Language Processing, Machine Learning or Deep Learning.
- Publication record in conferences/journals is a plus.
- 2+ years of working/research experience building NLP-based solutions is preferred.

If you feel that you are the ideal candidate & can bring a lot of value to our culture & the company's vision, then please do apply. If your profile matches our requirements, you will hear from one of our team members. We are looking for someone who can be part of our team, not just an employee.

Job perks
Insurance, travel compensation & others
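As a purely illustrative sketch of the rule-based NLP techniques the requirements mention (not part of the role itself), a toy named entity recognizer can be built from a small gazetteer plus a regex rule in plain Python — the names and patterns below are hypothetical examples, not Artivatic's actual approach:

```python
import re

# Toy rule-based named entity recognition: a small gazetteer of known
# organization names, plus a regex that matches titled person names.
# This is a minimal sketch of the idea, not a production NER system.
ORG_GAZETTEER = {"Artivatic", "NLTK", "Stanford NLP"}

# Matches "Dr./Mr./Ms. <Capitalized Name(s)>" and captures the name.
PERSON_PATTERN = re.compile(r"\b(?:Mr|Ms|Dr)\.\s+([A-Z][a-z]+(?:\s[A-Z][a-z]+)*)")

def extract_entities(text):
    """Return (entity, label) pairs found by gazetteer lookup and the regex rule."""
    entities = []
    for org in sorted(ORG_GAZETTEER):  # sorted for deterministic output order
        if org in text:
            entities.append((org, "ORG"))
    for match in PERSON_PATTERN.finditer(text):
        entities.append((match.group(1), "PERSON"))
    return entities

print(extract_entities("Dr. Jane Doe joined Artivatic to work on NLTK pipelines."))
# → [('Artivatic', 'ORG'), ('NLTK', 'ORG'), ('Jane Doe', 'PERSON')]
```

Statistical and neural approaches (e.g. NLTK's trained chunkers or deep-learning models) generalize far better than rules like these; the sketch only shows the rule-based baseline the listing contrasts them with.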
OpexAI
at OpexAI
1 recruiter
Jasmine Shaik
Posted by Jasmine Shaik
Hyderabad
0 - 1 yrs
₹1L - ₹1L / yr
skill iconData Science
skill iconR Programming
skill iconPython
TensorFlow
Freshers with skills in Big Data, Data Science, or Computer Vision.