Deep Learning Engineer

at TIGI HR Solution Pvt. Ltd.

Posted by Happy Kantesariya
Bengaluru (Bangalore)
3 - 5 yrs
₹7L - ₹15L / yr
Full time
Skills
Python
C++
CUDA
TensorFlow
PyTorch
Linux administration
OpenCV
ROS
Deep Learning

About the role:

We are looking for an engineer to apply deep learning algorithms to implement and improve perception algorithms for autonomous vehicles. The position involves the full life cycle of deep learning development, including data collection, feature engineering, model training, and testing. You will have the opportunity to implement state-of-the-art deep learning algorithms and apply them to real end-to-end production, working with the team and team lead on challenging deep learning projects to deliver product-quality improvements.

Responsibilities:

  • Build novel architectures for classifying, detecting, and tracking objects.
  • Develop efficient Deep Learning architectures that can run in real-time on NVIDIA devices.
  • Optimize the stack for deployment on embedded devices.
  • Work on the data pipeline: data acquisition, pre-processing, and analysis.

Skillsets:

  • Languages: C++, Python.
  • Frameworks: CUDA, TensorRT, PyTorch, TensorFlow, ONNX.
  • Good understanding of Linux and Version Control (Git, GitHub, GitLab).
  • Experience using OpenCV and deep learning to solve image-domain problems.
  • Strong understanding of ROS.
  • Skilled with software design, development, and bug-fixing.
  • Coordinate with team members for the development and maintenance of the package.
  • Strong mathematical skills and understanding of probabilistic techniques.
  • Experience handling large data sets efficiently.
  • Experience deploying deep learning models for real-time applications on NVIDIA platforms such as Drive AGX Pegasus and Jetson AGX Xavier.


Add On Skills:

  • Frameworks: PyTorch Lightning
  • Experience with autonomous robots
  • OpenCV projects, Deep Learning projects
  • Experience with 3D data and representations (point clouds, meshes, etc.)
  • Experience with a wide variety of deep learning models (e.g., LSTM, RNN, CNN, GAN)

About TIGI HR Solution Pvt. Ltd.

TIGI HR works in a new and technologically advanced way to hire the most productive, high-quality talent in less time. Experience the new edge of recruitment with us.
Founded 2014  •  Services  •  Profitable

Similar jobs

Data Engineer

at Slintel

Agency job
via Qrata
Big Data
ETL
Apache Spark
Spark
Data engineer
Data engineering
Linux/Unix
MySQL
Python
Amazon Web Services (AWS)
Bengaluru (Bangalore)
4 - 9 yrs
₹20L - ₹28L / yr
Responsibilities
  • Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for Data Lake/Data Warehouse.
  • Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs.
  • Assemble large, complex data sets from third-party vendors to meet business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technology.
  • Streamline existing and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems.

Requirements
  • 5+ years of experience in a Data Engineer role.
  • Proficiency in Linux.
  • Must have SQL knowledge and experience working with relational databases and query authoring (SQL), as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
  • Must have experience with Python/Scala.
  • Must have experience with Big Data technologies like Apache Spark.
  • Must have experience with Apache Airflow.
  • Experience with data pipeline and ETL tools like AWS Glue.
  • Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
Job posted by
Prajakta Kulkarni

Head of Engineering

at 60 Decibels

Founded 2019  •  Products & Services  •  20-100 employees  •  Raised funding
CI/CD
SaaS
Ruby
Ruby on Rails (ROR)
Javascript
Python
PostgreSQL
Amazon Web Services (AWS)
Bengaluru (Bangalore)
10 - 15 yrs
₹60L - ₹70L / yr

Head of Engineering

 

60 Decibels (https://60decibels.com/) is an impact measurement company that makes it easy to listen to the people who matter most. We believe that the best way to understand social impact is by talking to the people experiencing that impact. It sounds obvious when you say it, but that is not the typical practice for many impact investors, corporations and foundations working to create social change.

 

We collect social impact data directly from beneficiaries (customers / employees / suppliers) using our network of 1000+ trained researchers in 70+ countries. We do it quickly and without the fuss typically associated with measuring social impact. Our researchers speak directly to customers to understand their lived experience, and our team turns all this data into benchmarked social performance reports, with accompanying insights, to help our clients demonstrate and improve social performance.

 

If you want to help build interesting solutions to help social enterprises that are solving some of the world’s most challenging problems, then read on! We're looking for a Head of Engineering to join our global team full-time, in a hybrid role for our Bengaluru office.

 

About the Position

 

We are seeking a full-time Head of Engineering to lead a team of developers while solving real user problems through the smart, efficient application of technical knowledge and tools. You will work closely with a multidisciplinary team and will be responsible for translating product specs into clean, functional, production-ready code and for further developing features of our platform across the full technical stack.

 

You will have the opportunity to work across many types of projects across the organization. Your primary responsibility will be advancing our integrated data capture and insights platform (Ruby/React/PostgreSQL) and associated tooling (Python), which will involve, in part:

 

  • Working with the Leadership, Product and Operations teams, and leading the Engineering team on requirements gathering, specifications and scoping for feature development & product initiatives
  • Designing, developing, testing and maintaining robust applications and interfaces to a high level of quality, reliability and scalability
  • Anticipating and leading the definition of the systems architecture vision to better support our team’s needs
  • Growing our technical capacity by mentoring other engineers and interviewing candidates
  • Collaborating with team members to determine best practices and requirements for our software stack
  • Participating in code reviews and modeling good development practices (such as test-writing)
  • Troubleshooting coding problems quickly and efficiently to ensure a productive workplace

 

About you

 

First and foremost, you bring passion and dedication to this work because it matters to you. You are a pragmatic and product-driven engineer who is interested in solving user problems and delivering value while taking into account tradeoffs between Business and Tech. You have a bias towards action: you get your hands dirty and actively tackle problems in a way that leads to the best outcomes and brings teams together. You successfully balance flexibility and rigor, using informed judgement to make decisions. You model critical thinking and introspection, taking strategic risks and growing from mistakes. You are decisive and bold, have a growth mindset, are an excellent communicator, and know the value of being a part of an effective team. More specifically, you bring:

  • 10+ years of experience in a software engineering role, preferably building a SaaS product. You can demonstrate the impact that your work has had on the product and/or the team
  • Deep knowledge of the frameworks we use (e.g. Ruby on Rails, React), or the interest and ability to pick up new languages and frameworks quickly
  • Professional experience building production-ready, data-intensive, applications and APIs
  • Professional experience developing and deploying applications using cloud service providers (e.g. AWS, GCP, Azure, GitHub CI/CD)
  • Demonstrated knowledge of web applications, cybersecurity, open-source technologies
  • Demonstrated ability to lead a team
  • Outstanding collaboration and communication skills with both tech and non-tech teams/stakeholders.

 

Working with 60 Decibels

 

We are a fun, international and highly-motivated team who believes that team members should have the opportunity to expand their skills and career in a supportive environment. We currently have offices in New York, London, Nairobi and Bengaluru. Please note that permanent work authorization in one of these geographies is required.

 

We offer a competitive salary, the opportunity to work flexibly and in a fun, supportive working environment.

 

As a growing company, we are building towards a more universally accessible workplace for our employees. At this time, we do use some cloud-based technologies that are not compatible with screen readers and other assistive devices. We would be happy to discuss accessibility at 60 Decibels in greater depth during the recruitment process.

 

Want to get to know us a little better?

> Sign up to receive The Volume (https://us20.campaign-archive.com/home/?u=eb8a3471cbcc7f7bb20ae1019&id=4f8f9fc97a), our monthly collection of things worth reading.

> Visit our website at 60decibels.com (http://www.60decibels.com/).

> Read about our team values here: https://drive.google.com/a/60decibels.com/file/d/1XxQkrGpNrwQzuHBzq3KqNVVASoq_IZM9/view?usp=sharing

Job posted by
Jay Batavia

AI Architect

at mavQ

Founded 2020  •  Product  •  100-500 employees  •  Raised funding
Artificial Intelligence (AI)
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Deep Learning
MLOps
Remote, Hyderabad
6 - 13 yrs
₹42L - ₹65L / yr

What You’ll Do:

  • Accurate translation of business needs into a conceptual and technical architecture design of AI models and solutions
  • Collaboration with developers and engineering teams resolving challenging tasks and ensuring proposed design is properly implemented
  • Strategy for managing the changes to the AI models (new business needs, technology changes, model retraining, etc.)
  • Collaborate with business partners and clients for AI solutioning and use cases. Provide recommendations to drive alignment with business teams
  • Define and implement evaluation strategies for each model, demonstrate applicability and performance of the model, and identify its limits
  • Design complex system integrations of AI technologies with API-driven platforms, using best practices for security and performance
  • Experience in languages, tools, and technologies such as Python, TensorFlow, PyTorch, Kubernetes, Docker, etc.
  • Experience with MLOps tools (such as TFX, TensorFlow Serving, KubeFlow) and methodologies for CI/CD of ML models
  • Proactively identify and address technical strengths, weaknesses, and opportunities across the AI and ML domain
  • Provide strategic direction for maximizing simplification and re-use, lowering overall TCO


What You’ll Bring:

  • Minimum 10 years of hands-on experience in the IT field, with at least 6 years in data science / ML / AI implementation-based products and solutions
  • Experience with computer vision: Vision AI and Document AI
  • Must be hands-on with the Python programming language, MLOps, TensorFlow, PyTorch, Keras, scikit-learn, etc.
  • Well versed in deep learning concepts, computer vision, image processing, document processing, convolutional neural networks, and data ontology applications
  • Proven track record of executing projects in agile, cross-functional teams
  • Published research papers at reputable AI conferences, and the ability to lead and drive a research mindset across the team
  • Good to have: experience with GCP / Microsoft Azure / Amazon Web Services
  • Ph.D. or Master's in a quantitative field such as Computer Science, IT, or Statistics/Mathematics

 

What we offer:

  • Group Medical Insurance (Family Floater Plan - Self + Spouse + 2 Dependent Children)
    • Sum Insured: INR 5,00,000/- 
    • Maternity cover up to two children
    • Inclusive of COVID-19 Coverage
    • Cashless & Reimbursement facility
    • Access to free online doctor consultation

  • Personal Accident Policy (Disability Insurance)
    • Sum Insured: INR 25,00,000/- per employee
    • Accidental death and permanent total disability covered up to 100% of the sum insured
    • Permanent partial disability covered as per the scale of benefits decided by the insurer
    • Temporary total disability covered

  • An option of a Paytm Food Wallet (up to Rs. 2,500) as a tax-saver benefit
  • Monthly internet reimbursement of up to Rs. 1,000
  • Opportunity to pursue executive programs/courses at top universities globally
  • Professional development opportunities through various MTX-sponsored certifications on multiple technology stacks, including Google Cloud, Amazon, and others.
Job posted by
Prakash Guptha

Senior Data Engineer

at Velocity.in

Founded 2019  •  Product  •  20-100 employees  •  Raised funding
ETL
Informatica
Data Warehouse (DWH)
Data engineering
Oracle
PostgreSQL
DevOps
Amazon Web Services (AWS)
NodeJS (Node.js)
Ruby on Rails (ROR)
React.js
Python
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹35L / yr

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies: NodeJS, Ruby on Rails, reactive programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of records for the data warehouse

  • Implement Data warehouse entities with common re-usable data model designs with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)

 

What To Bring

  • 5+ years of software development experience; startup experience is a plus.

  • Past experience of working with Airflow and DBT is preferred

  • 5+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced with the formulation of ideas; building proof-of-concept (POC) and converting them to production-ready projects

  • Experience building and deploying applications on both on-premise and cloud-based (AWS or Google Cloud) infrastructure

  • A basic understanding of Kubernetes and Docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

Job posted by
Newali Hazarika

Cyber Security Intern - Remote

at Spacenos

Founded 2015  •  Product  •  20-100 employees  •  Raised funding
C++
Java
Remote only
0 - 1 yrs
₹8,000 - ₹12,000 / mo

We are looking for a skilled cyber security intern with experience in protecting computers and networks from cyber attackers.


Your life at Spacenos as a Cyber Security Intern:

Spacenos helps our team grow and explore in the fields of their choice. You will get to work with various departments to understand how to analyze cyber threats and protect and maintain networks. Imagine waking up in the morning, grabbing a cup of coffee and sitting with the CEO and the Spacenos team to intellectually challenge yourself with new projects every single day. Our exciting work culture and product plans will help you gain valuable experience and motivate you to strive for greater heights.


What you will need:

  1. Knowledge of the operating systems Unix, Linux, and Windows
  2. Understanding of SaaS and cloud computing models
  3. Skills in cybersecurity and security awareness
  4. Knowledge of how to use forensic tools
  5. An understanding of programming languages such as Java, C, C++, and PHP
  6. Strong communication and decision-making abilities
  7. Vulnerability awareness and penetration testing


What you will be doing:

  1. Monitor security access and keep data up to date.
  2. Install and advise on the best tools and countermeasures.
  3. Instill in the team a sense of computer security and procedure.
  4. Analyse cyber threats and report on them, gathering information from both external and internal sources.
  5. Perform vulnerability and risk analysis.
  6. Examine security flaws and determine their root causes.
  7. Monitor and report the company's incidents to the disaster recovery planners.
  8. Collaborate with providers to meet security needs.


About Us

Spacenos has been innovating in the healthcare, finance, and marketing domains since 2015 and has won multiple awards and recognitions from more than 30 MNCs and Fortune 500 companies. It is funded and supported by the Govt. of Karnataka, angel investors, and international grants.

Hiring Process:

  1. Apply, and have your CV and past work reviewed.
  2. Receive a telephonic interview or assessment upon filling out the Final Step form.
  3. Receive an offer letter if selected.


Hiring Duration:

Our hiring process takes less than 24 hours from the time you receive the Final Step form.


Validity: Up to Dec 2023

- Apply soon; earlier applicants are preferred over later ones.

Job posted by
Venkatesh Devale
PySpark
Data engineering
Big Data
Hadoop
Spark
Python
AWS Lambda
SQL
Kafka
Bengaluru (Bangalore)
6 - 8 yrs
₹8L - ₹15L / yr
6-8 years of experience as a data engineer
Job posted by
Prashma S R

Data Engineer

at Futurense Technologies

Founded 2020  •  Services  •  20-100 employees  •  Bootstrapped
ETL
Data Warehouse (DWH)
Apache Hive
Informatica
Data engineering
Python
SQL
Amazon Web Services (AWS)
Snowflake schema
SSIS
Bengaluru (Bangalore)
2 - 7 yrs
₹6L - ₹12L / yr
1. Create and maintain optimal data pipeline architecture
2. Assemble large, complex data sets that meet business requirements
3. Identify, design, and implement internal process improvements
4. Optimize data delivery and re-design infrastructure for greater scalability
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
6. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
7. Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs
8. Create data tools for analytics and data scientist team members
 
Skills Required:
 
1. Working knowledge of ETL on any cloud (Azure / AWS / GCP)
2. Proficient in Python (Programming / Scripting)
3. Good understanding of any of the data warehousing concepts (Snowflake / AWS Redshift / Azure Synapse Analytics / Google Big Query / Hive)
4. In-depth understanding of principles of database structure
5.  Good understanding of any of the ETL technologies (Informatica PowerCenter / AWS Glue / Data Factory / SSIS / Spark / Matillion / Talend / Azure)
6. Proficient in SQL (query solving)
7. Knowledge of change management / version control (VSS / DevOps / TFS / GitHub / Bitbucket) and CI/CD (Jenkins)
Job posted by
Rajendra Dasigari

Consultant - Data Science

at Affine Analytics

Founded 2011  •  Services  •  100-1000 employees  •  Profitable
DSG
SQL
MS-PowerPoint
MS-Excel
Machine Learning (ML)
Deep Learning
Statistical Modeling
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹20L / yr

Responsibilities:

  • Complete accountability for delivering 1-2 projects from conception to implementation
  • Managing a team of Associates and Senior Associates
  • Interviewing the client intelligently to learn important figures and gather requirements
  • Managing project timing, client expectations, and deadlines
  • Creating smart and impactful PowerPoint presentations
  • Playing a proactive role in business development (preparing sales collaterals, pitch documents, etc.) and organization building (training, recruitment, etc.)
  • Presenting final results to the client and discussing further opportunities within/outside the project, along with maintaining documentation and reports
  • Planning deliverables and milestones for the projects you are responsible for
  • Providing business analysis and business-area assessment
  • Facilitating meetings within the team on a regular basis
  • Tracking and reporting team hours

Behavioural Competences:

  • Ability to think strategically and analytically in order to effectively assess each assignment
  • Excellent written and oral communication skills
  • Ability to work to tight deadlines and under pressure
  • Excellent interpersonal and organizational skills
  • Good listening, comprehension, and management skills
  • Willingness to travel/work abroad

Job posted by
Shivam Mantry

Data Scientist

at One Labs

Founded 2015  •  Product  •  20-100 employees  •  Raised funding
Data Science
Deep Learning
Python
Keras
TensorFlow
Machine Learning (ML)
NCR (Delhi | Gurgaon | Noida)
1 - 3 yrs
₹3L - ₹6L / yr

Job Description


We are looking for a data scientist who will help us discover the information hidden in vast amounts of data and help us make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products.

Responsibilities

  • Selecting features, building and optimizing classifiers using machine learning techniques
  • Data mining using state-of-the-art methods
  • Extending the company’s data with third-party sources of information when needed
  • Enhancing data collection procedures to include information that is relevant for building analytic systems
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Doing ad-hoc analysis and presenting results in a clear manner
  • Creating automated anomaly detection systems and constant tracking of its performance

Skills and Qualifications

  • Excellent understanding of machine learning techniques and algorithms, such as Linear regression, SVM, Decision Forests, LSTM, CNN etc.
  • Experience with Deep Learning preferred.
  • Experience with common data science toolkits, such as R, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
  • Great communication skills
  • Proficiency in using query languages such as SQL, Hive, Pig 
  • Good applied statistics skills, such as statistical testing, regression, etc.
  • Good scripting and programming skills 
  • Data-oriented personality
Job posted by
Rahul Gupta

Analyst - Business Analytics

at LatentView Analytics

Founded 2006  •  Products & Services  •  100-1000 employees  •  Profitable
Business Intelligence (BI)
Analytics
SQL server
Data Visualization
Tableau
Business-IT alignment
Communication Skills
Science
Problem solving
Python
Looker
EDX
Chennai
1 - 4 yrs
₹2L - ₹10L / yr
Title: Analyst - Business Analytics
Experience: 1 - 4 Years
Location: Chennai
Open Positions: 17

Roles & Responsibilities:
  • Designing and implementing analytical projects that drive business goals and decisions, leveraging structured and unstructured data.
  • Generating a compelling story from insights and trends in a complex data environment.
  • Working shoulder-to-shoulder with business partners to come up with creative approaches to solve the business problem.
  • Creating dashboards for business heads by exploring available data assets.

Qualifications:
  • Overall 1+ years of business analytics experience with strong communication skills.
  • Bachelor's or Master's degree in computer science is preferred.
  • Excellent problem-solving and client-orientation skills.

Skills Required:
  • Ability to program in advanced SQL is a must.
  • Hands-on experience in modeling tools such as R or Python.
  • Experience in visualization tools such as Power BI, Tableau, Looker, etc. would be a big plus.
  • Analytics certifications from recognized platforms (Udemy, Coursera, edX, etc.) would be a plus.
Job posted by
Kannikanti madhuri