Data modeling Jobs in Pune

Apply to 10+ Data modeling Jobs in Pune on CutShort.io. Explore the latest Data modeling Job opportunities across top companies like Google, Amazon & Adobe.

Fictiv
Posted by Margaret Moses
Pune
5 - 12 yrs
Best in industry
Salesforce Apex
Salesforce Visualforce
Salesforce Lightning
Salesforce development
Data modeling

What’s in it for you?

Opportunity To Unlock Your Creativity

Think of all the times you were held back from trying new ideas because you were boxed in by bureaucratic legacy processes or old-school tactics. A growth mindset has been deeply ingrained in our company culture since day one, so Fictiv is an environment where you have the creative liberty, and the support of the team, to try big, bold ideas to achieve our sales and customer goals.

Opportunity To Grow Your Career

At Fictiv, you'll be surrounded by supportive teammates who will push you to be your best through their curiosity and passion.

 


Impact In This Role

  • Excellent problem-solving, decision-making and critical-thinking skills.
  • Collaborative; a team player.
  • Excellent verbal and written communication skills.
  • Exhibits initiative, integrity and empathy.
  • Enjoys working with a diverse group of people in multiple regions.
  • Comfortable not knowing answers, but resourceful and able to resolve issues.
  • Self-starter; comfortable with ambiguity, asking questions and constantly learning.
  • Customer-service mentality; advocates for another person's point of view.
  • Methodical and thorough in written documentation and communication.
  • Culture-oriented; wants to work with people rather than in isolation.

You will report to the Director of IT Engineering.

What You’ll Be Doing

  • Interface with Business Analysts and stakeholders to understand and clarify requirements
  • Develop technical designs for solutions
  • Implement high-quality, scalable solutions following best practices, including configuration and code
  • Deploy solutions and code using automated deployment tools
  • Take ownership of technical deliverables; ensure that quality work is completed, fully tested, and delivered on time
  • Conduct code reviews, optimization, and refactoring to minimize technical debt within Salesforce implementations
  • Collaborate with cross-functional teams to integrate Salesforce with other systems and platforms, ensuring seamless data flow and system interoperability
  • Identify opportunities for process improvements; mentor and support other developers/team members as needed
  • Stay updated on new Salesforce features and functionality and provide recommendations for process improvements


Desired Traits

  • 8-10 years of experience in Salesforce development
  • Proven experience developing Salesforce solutions, with a deep understanding of Apex, Visualforce, Lightning Web Components, and the Salesforce APIs (see the sketch below)
  • Experience with Salesforce CPQ, Sales/Manufacturing Cloud, and Case Management
  • Experienced in designing and implementing custom solutions that align with business needs
  • Strong knowledge of Salesforce data modeling, reporting, and database design
  • Demonstrated experience building and maintaining integrations between Salesforce and external applications
  • Strong unit testing, functional testing and debugging skills
  • Strong understanding of best practices
  • Active Salesforce certifications are desirable
  • Experience with MuleSoft is a plus
  • Excellent communication skills and the ability to translate complex technical requirements into actionable solutions
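
For a flavour of the Salesforce API integration work mentioned above, here is a minimal Python sketch that runs a SOQL query through the Salesforce REST API. The instance URL and token are hypothetical placeholders; a real integration would obtain the token via OAuth, and much of this role's work would live in Apex rather than Python.

    import requests

    # Hypothetical values; a real integration obtains the token via OAuth.
    INSTANCE_URL = "https://example.my.salesforce.com"
    ACCESS_TOKEN = "<oauth-access-token>"

    def run_soql(query):
        """Run a SOQL query via the Salesforce REST API and return the records."""
        resp = requests.get(
            f"{INSTANCE_URL}/services/data/v58.0/query",
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            params={"q": query},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["records"]

    for account in run_soql("SELECT Id, Name FROM Account LIMIT 5"):
        print(account["Id"], account["Name"])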

Interested in learning more? We look forward to hearing from you soon.

Pune
3 - 5 yrs
₹20L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
CI/CD

As an engineer, you will help with the implementation and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elasticsearch, Lambda, ECS, Redis, Spark, Kafka, etc.) and apply new technologies to solve problems. You will influence the definition of product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale, highly available systems.

If you are looking for an opportunity to work on leading technologies, want to build product technology that can cater to millions of customers while providing them the best experience, and relish large ownership and diverse technologies, join our team today!

 

What You'll Do:

  • Create detailed designs, work on development, and perform code reviews.
  • Implement validation and support activities in line with architecture requirements.
  • Help the team translate business requirements into R&D tasks and manage the roadmap of those tasks.
  • Design, build, and implement the product; participate in requirements elicitation, validation of architecture, and creation and review of high- and low-level designs; assign and review tasks for product implementation.
  • Work closely with product managers, UX designers and end users, and integrate software components into a fully functional system.
  • Own products/features end-to-end for all phases, from development to production.
  • Ensure the developed features are scalable and highly available with no quality concerns.
  • Work closely with senior engineers to refine designs and implementations.
  • Manage and execute against project plans and delivery commitments.
  • Assist directly and indirectly in the continual hiring and development of technical talent.
  • Create and execute appropriate quality plans, project plans, test strategies and processes for development activities, in concert with business and project management efforts.

The ideal candidate is an engineer who is passionate about delivering experiences that delight customers and about creating robust solutions. He/she should be able to commit to and own deliveries end-to-end.

 

 

What You'll Need:

 

  • A bachelor's degree in Computer Science or a related technical discipline.
  • 2-3+ years of software development experience with proficiency in Java or equivalent object-oriented languages, coupled with design and SOA experience.
  • Fluency with Java and Spring is expected.
  • Experience with JEE applications and frameworks like Struts, Spring, MyBatis, Maven, Gradle.
  • Strong knowledge of data structures, algorithms and CS fundamentals.
  • Experience with at least one shell scripting language, SQL (SQL Server, PostgreSQL), and data modeling.
  • Excellent analytical and reasoning skills.
  • Ability to learn new domains and deliver output.
  • Hands-on experience with the core AWS services.
  • Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)

 

  • Expertise in at least one of the following:
    - Kafka, ZeroMQ, AWS SNS/SQS, or an equivalent streaming technology (see the sketch after this list)
    - Distributed caches/in-memory data grids like Redis, Hazelcast, Ignite, or Memcached
    - Distributed column-store databases like Snowflake, Cassandra, or HBase
    - Spark, Flink, Beam, or equivalent streaming data processing frameworks
  • Proficiency in writing and reviewing Python and other object-oriented languages is a plus
  • Experience building automations and CI/CD pipelines (integration, testing, deployment)
  • Experience with Kubernetes would be a plus
  • Good understanding of working with distributed teams using Agile: Scrum, Kanban
  • Strong interpersonal skills as well as excellent written and verbal communication skills
  • Attention to detail and quality, and the ability to work well in and across teams
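
Since the role calls for expertise in a streaming technology such as Kafka, here is a minimal produce/consume sketch using the kafka-python client. The broker address, topic name, and event fields are assumptions for illustration; the role itself is Java-centric, so treat this as a sketch of the pattern rather than the stack.

    import json
    from kafka import KafkaProducer, KafkaConsumer

    # Hypothetical broker and topic.
    BROKER = "localhost:9092"
    TOPIC = "orders"

    # Produce one JSON-encoded event.
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"order_id": 42, "status": "created"})
    producer.flush()

    # Consume events from the beginning of the topic.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)
        break  # stop after the first event in this sketch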

Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Provide technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ years total in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years on data warehouse, ETL and BI projects.
• Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse, and three end-to-end on-premises data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
• Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning and Time Travel, and an understanding of how to use these features (see the sketch after this list).
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns.
• Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
• Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
• Deep understanding of relational as well as NoSQL data stores, and of modelling methods and approaches (star and snowflake schemas, dimensional modelling).
• Experience with data security, and with data access controls and their design.
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS.
• Ability to build processes supporting data transformation, data structures, metadata, dependency and workload management.
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
• Ability to resolve an extensive range of complicated data pipeline problems, proactively and as issues surface.
• Must have expertise in AWS or Azure Platform as a Service (PaaS).
• Certified Snowflake Cloud Data Warehouse Architect (desirable).
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience with Agile development methodologies.
• Effective and persuasive in both written and oral communication.
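
To make the zero-copy cloning and Time Travel expectations concrete, here is a minimal sketch using the snowflake-connector-python driver. The account, credentials, warehouse, and table names are hypothetical placeholders.

    import snowflake.connector

    # Hypothetical connection parameters; supply your own account and credentials.
    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        warehouse="ANALYTICS_WH",
        database="SALES_DB",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # Zero-copy clone: an instant, storage-free copy of a table, e.g. for testing.
    cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

    # Time Travel: query the table as it existed one hour ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print(cur.fetchone()[0])

    cur.close()
    conn.close()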

Nice-to-have Skills/Qualifications:
• Bachelor's and/or master's degree in computer science, or equivalent experience.
• Strong communication, analytical and problem-solving skills with high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands-on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay abreast of the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

XpressBees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
Company Profile
XpressBees – a logistics company started in 2015 – is among the fastest-growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as the service provider of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and our (RT, NRT, batch) reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models (see the sketch after this list). These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
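
For a flavour of the pipeline work described in the first bullet, here is a minimal PySpark batch-aggregation sketch. The S3 paths, table, and column names are hypothetical placeholders invented for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("shipment-reporting").getOrCreate()

    # Read raw shipment events from a (hypothetical) data-lake location.
    shipments = spark.read.parquet("s3://xb-data-lake/raw/shipments/")

    # A small batch aggregation of the kind a reporting pipeline might produce:
    # daily delivered-shipment counts per hub.
    daily_counts = (
        shipments
        .filter(F.col("status") == "DELIVERED")
        .groupBy(F.to_date("delivered_at").alias("day"), "hub_id")
        .count()
    )

    # Land the mart for dashboarding consumers.
    daily_counts.write.mode("overwrite").parquet(
        "s3://xb-data-lake/marts/daily_hub_deliveries/"
    )
    spark.stop()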

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching, in order to make technology and design choices.
• Strong experience in system integration, application development, ETL and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience, including:
  - Expertise in relational and dimensional modelling
  - Exposure across the entire SDLC
  - Experience in cloud architecture (AWS)
• Proven track record of keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization understand application architectures and integration approaches, architect advanced cloud-based solutions, and help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Easebuzz
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
QlikView
Tableau
Power BI
Data Visualization

Company Profile:

 

Easebuzz is a payment solutions company (a fintech organisation) that enables online merchants to accept, process and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions and eKYC is happening at the same time.

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, data modeling, and data architecture is expected. Key requirements and responsibilities:

• Design, build and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.
• Experience with an AWS cloud data lake for development of real-time or near-real-time use cases.
• Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing (see the sketch after this list).
• Build data pipeline frameworks to automate high-volume and real-time data delivery.
• Create prototypes and proofs-of-concept for iterative development.
• Experience with NoSQL databases such as DynamoDB, MongoDB, etc.
• Create and maintain optimal data pipeline architecture.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
• Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
• Work with stakeholders, including the Executive, Product, Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
• Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
• Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and science workflows.
• Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
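
Given the emphasis on Kafka/Kinesis ingestion above, here is a minimal sketch of writing a payment event to an Amazon Kinesis stream with boto3. The region, stream name, and event fields are hypothetical placeholders.

    import json
    import boto3

    kinesis = boto3.client("kinesis", region_name="ap-south-1")

    # A hypothetical payment event for illustration.
    event = {"txn_id": "T1001", "merchant_id": "M42", "amount_inr": 2499.0}

    # Partitioning by merchant keeps a merchant's events ordered within a shard.
    kinesis.put_record(
        StreamName="payments-events",
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["merchant_id"],
    )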

 

Employment Type

Full-time

 

Pune
5 - 8 yrs
₹11L - ₹30L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript

Sr. Java Software Engineer

Preferred Education & Experience:

  • Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, or a related technical field; relevant experience of at least 3 years in lieu of the above if from a different stream of education.
  • Well-versed in, and 5+ years of hands-on design experience with, Object-Oriented Design, Data Modeling, Class & Object Modeling, and Microservices Architecture & Design.
  • Well-versed in, and 5+ years of hands-on programming experience with, core Java, advanced Java, the Spring Framework, Spring Boot or the Micronaut framework, logging frameworks, build & deployment frameworks, etc.
  • 3+ years of hands-on experience developing domain-driven microservices using libraries & frameworks such as Micronaut, Spring Boot, etc.
  • 3+ years of hands-on experience developing with connector frameworks such as Apache Camel, the Akka framework, etc.
  • 3+ years of hands-on experience with RDBMS & NoSQL database concepts and development practices (PostgreSQL, MongoDB, Elasticsearch, Amazon S3).
  • 3+ years of hands-on experience developing web services using REST and API Gateway, with token-based authentication and access management.
  • 1+ years of hands-on experience developing and hosting microservices using serverless and container-based development (AWS Lambda, Docker, Kubernetes, etc.); see the sketch below.
  • Knowledge of and hands-on experience developing applications using Behavior-Driven Development and Test-Driven Development methodologies is a plus.
  • Knowledge of and hands-on experience with AWS cloud services such as IAM, Lambda, EC2, ECS, ECR, API Gateway, S3, SQS, Kinesis, CloudWatch, DynamoDB, etc. is also a plus.
  • Knowledge of and hands-on experience with DevOps CI/CD tools such as JIRA, Git (Bitbucket/GitHub), Artifactory, etc., and build tools such as Maven & Gradle.
  • 2+ years of hands-on development experience with Java-centric developer tools, management & governance, networking and content delivery, and security, identity, and compliance.
  • Knowledge of and hands-on experience with Apache NiFi, Apache Spark, or Apache Flink is also a plus.
  • Knowledge of and hands-on experience with Python, NodeJS, or Scala programming is also a plus.

Required Experience: 5+ Years
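
For the serverless item above, here is a minimal, hypothetical AWS Lambda handler written in Python; the role itself centres on Java/Micronaut, so this sketches the serverless pattern rather than the stack in use. The event shape assumes an API Gateway proxy integration.

    import json

    def handler(event, context):
        # API Gateway proxy integrations deliver the request body as a string.
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"hello, {name}"}),
        }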

Job Location: Remote / Pune

Open Positions: 1

Mobile Programming India Pvt Ltd
Posted by Inderjit Kaur
Bengaluru (Bangalore), Chennai, Pune, Gurugram
4 - 8 yrs
₹9L - ₹14L / yr
ETL
Data Warehouse (DWH)
Data engineering
Data modeling
BRIEF JOB RESPONSIBILITIES:

• Responsible for designing, deploying, and maintaining an analytics environment that processes data at scale.
• Contribute design, configuration, deployment, and documentation for components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, enrichment, and loading of data into a variety of cloud data platforms, including AWS and Microsoft Azure.
• Identify gaps in, and improve, the existing platform to improve quality, robustness, maintainability, and speed.
• Evaluate new and upcoming big data solutions and make recommendations for adoption to extend our platform to meet advanced analytics use cases, such as predictive modeling and recommendation engines.
• Data modelling and data warehousing at cloud scale using cloud-native solutions.
• Perform development, QA, and DevOps roles as needed to ensure total end-to-end responsibility for solutions.

COMPETENCIES
• Experience building, maintaining, and improving data models / processing pipelines / routing in large-scale environments (a small ETL sketch follows this list).
• Fluency in common query languages, API development, data transformation, and integration of data streams.
• Strong experience with large-dataset platforms (e.g. Amazon EMR, Amazon Redshift, AWS Lambda & Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, Databricks).
• Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing.
• Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases.
• Understanding of traditional data warehouse components (e.g. ETL, business intelligence tools).
• Creativity to go beyond current tools to deliver the best solution to the problem.
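
As a concrete (if toy) illustration of the extract-transform-load flow described above, here is a small pandas sketch; the file paths and column names are hypothetical.

    import pandas as pd

    # Extract: read raw order records.
    orders = pd.read_csv("raw/orders.csv", parse_dates=["order_date"])

    # Transform: drop incomplete rows, normalise text, derive a reporting column.
    orders = orders.dropna(subset=["order_id", "amount"])
    orders["country"] = orders["country"].str.upper()
    orders["order_month"] = orders["order_date"].dt.to_period("M").astype(str)

    # Load: columnar output ready for bulk ingestion into a warehouse.
    orders.to_parquet("staged/orders.parquet", index=False)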
Gurugram, Pune, Mumbai, Bengaluru (Bangalore), Chennai, Nashik
4 - 12 yrs
₹12L - ₹15L / yr
Data engineering
Data modeling
data pipeline
Data integration
Data Warehouse (DWH)

 

 

Designation – Deputy Manager – TS


Job Description

  1. Total of 8/9 years of development experience in Data Engineering (B1/BII role).
  2. Minimum of 4/5 years in AWS data integrations, with very good data modelling skills.
  3. Should be very proficient in end-to-end AWS data solution design, covering not only strong data ingestion and integration skills (both data at rest and data in motion) but also complete DevOps knowledge.
  4. Should have experience delivering at least 4 data warehouse or data lake solutions on AWS.
  5. Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc. (see the sketch after this list).
  6. Strong Python skills.
  7. Should be an expert in cloud design principles, performance tuning and cost modelling; AWS certifications will be an added advantage.
  8. Should be a team player with excellent communication, able to manage their work independently with minimal or no supervision.
  9. Life Science & Healthcare domain background will be a plus.
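
To illustrate the kind of Glue-plus-Python automation named in point 5, here is a minimal boto3 sketch that starts a Glue job run and checks its state. The job name, region, and argument are hypothetical placeholders.

    import boto3

    glue = boto3.client("glue", region_name="ap-south-1")

    # Kick off a (hypothetical) nightly ETL job with a run-date argument.
    run = glue.start_job_run(
        JobName="nightly-claims-etl",
        Arguments={"--run_date": "2024-01-31"},
    )

    # Poll the run's state; a real pipeline would loop or use Step Functions.
    status = glue.get_job_run(JobName="nightly-claims-etl", RunId=run["JobRunId"])
    print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED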

Qualifications

BE/BTech/ME/MTech

 

Datametica
Posted by Sayali Kachi
Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL Server
SQL
Linux/Unix

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience designing and implementing business applications using the Oracle relational database management system
  • Experience developing complex database objects like stored procedures, functions, packages and triggers using SQL and PL/SQL (see the sketch after this list)
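
As a small illustration of invoking the kind of PL/SQL stored procedure mentioned above from application code, here is a hedged Python sketch using the python-oracledb driver; the DSN, credentials, and procedure name are hypothetical placeholders.

    import oracledb

    conn = oracledb.connect(user="app", password="<password>", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Suppose a procedure UPDATE_ORDER_STATUS(p_order_id IN NUMBER,
    #                                         p_status   IN VARCHAR2) exists.
    cur.callproc("update_order_status", [1001, "SHIPPED"])

    conn.commit()
    cur.close()
    conn.close()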

Required Candidate Profile :

  • Excellent communication, interpersonal and analytical skills, and a strong ability to drive teams
  • Analyzes data requirements and data dictionaries for moderate to complex projects
  • Leads data-model-related analysis discussions while collaborating with application development teams, business analysts, and data analysts during joint requirements analysis sessions
  • Translates business requirements into technical specifications, with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL and analytics to the cloud by leveraging automation.

We have expertise in transforming legacy platforms such as Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle – Data Warehouse Assessment & Migration Planning product

Raven – Automated Workload Conversion product

Pelican – Automated Data Validation product, which helps automate and accelerate data migration to the cloud



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

The other Fruit
Posted by Dipendra Singh
Pune
1 - 5 yrs
₹3L - ₹15L / yr
Machine Learning (ML)
Artificial Intelligence (AI)
Python
Data Structures
Algorithms
 
SD (ML and AI) job description:

• Advanced degree in computer science, math, statistics or a related discipline (a master's degree is a must)
• Extensive data modeling and data architecture skills
• Programming experience in Python and R
• Background in machine learning frameworks such as TensorFlow or Keras
• Knowledge of Hadoop or other distributed computing systems
• Experience working in an Agile environment
• Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical methods), theory of statistics, numerical analysis (numerical linear algebra and quadrature), abstract algebra, number theory, real analysis, complex analysis, intermediate analysis (point-set topology)
• Strong written and verbal communication skills
• Hands-on experience with NLP and NLG
• Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their application (see the sketch below)
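
To make the statistical-modelling expectation concrete, here is a self-contained scikit-learn random-forest sketch on a bundled demo dataset; it is illustrative only and not tied to this employer's data or stack.

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Load a bundled binary-classification dataset and hold out a test split.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    # Fit a random forest and report held-out accuracy.
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))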
 