6sense
Senior Software Engineer - Data
Posted by Romesh Rawat
5 - 8 yrs
₹30L - ₹45L / yr
Remote only
Skills
Spark
Hadoop
Big Data
Data engineering
PySpark
Apache Spark
Python
ETL
Amazon Web Services (AWS)

About Slintel (a 6sense company) :

Slintel, a 6sense company, is the leader in capturing technographics-powered buying intent and helps companies uncover the 3% of active buyers in their target market. Slintel evaluates over 100 billion data points and analyzes factors such as buyer journeys, technology adoption patterns, and other digital footprints to deliver market & sales intelligence.

Slintel's customers have access to the buying patterns and contact information of more than 17 million companies and 250 million decision makers across the world.

Slintel is a fast-growing B2B SaaS company in the sales and marketing tech space. We are funded by top-tier VCs and are going after a billion-dollar opportunity. At Slintel, we are building a sales development automation platform that can significantly improve outcomes for sales teams while reducing the number of hours spent on research and outreach.

We are a big data company that performs deep analysis of technology buying patterns and buyer pain points to understand where buyers are in their journey. Over 100 billion data points are analyzed every week to derive recommendations on where companies should focus their marketing and sales efforts. Third-party intent signals are then combined with first-party data from CRMs to derive meaningful recommendations on whom to target on any given day.

6sense is headquartered in San Francisco, CA and has 8 office locations across 4 countries.

6sense, an account engagement platform, secured $200 million in a Series E funding round, bringing its total valuation to $5.2 billion 10 months after its $125 million Series D round. The investment was co-led by Blue Owl and MSD Partners, among other new and existing investors.

Linkedin (Slintel) : https://www.linkedin.com/company/slintel/

Industry : Software Development

Company size : 51-200 employees (189 on LinkedIn)

Headquarters : Mountain View, California

Founded : 2016

Specialties : Technographics, lead intelligence, Sales Intelligence, Company Data, and Lead Data.

Website (Slintel) : https://www.slintel.com/slintel

Linkedin (6sense) : https://www.linkedin.com/company/6sense/

Industry : Software Development

Company size : 501-1,000 employees (937 on LinkedIn)

Headquarters : San Francisco, California

Founded : 2013

Specialties : Predictive intelligence, Predictive marketing, B2B marketing, and Predictive sales

Website (6sense) : https://6sense.com/

Acquisition News : 

https://inc42.com/buzz/us-based-based-6sense-acquires-b2b-buyer-intelligence-startup-slintel/ 

Funding Details & News :

Slintel funding : https://www.crunchbase.com/organization/slintel

6sense funding : https://www.crunchbase.com/organization/6sense

https://www.nasdaq.com/articles/ai-software-firm-6sense-valued-at-%245.2-bln-after-softbank-joins-funding-round

https://www.bloomberg.com/news/articles/2022-01-20/6sense-reaches-5-2-billion-value-with-softbank-joining-round

https://xipometer.com/en/company/6sense

Slintel & 6sense Customers :

https://www.featuredcustomers.com/vendor/slintel/customers

https://www.featuredcustomers.com/vendor/6sense/customers

About the job

Responsibilities

  • Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for Data Lake/Data Warehouse
  • Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs
  • Assemble large, complex data sets from third-party vendors to meet business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technologies (a brief pipeline sketch follows this list)
  • Streamline existing and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems
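
As a purely illustrative sketch of the pipeline work described above (not part of the original posting), here is a minimal PySpark job that extracts raw vendor data from S3, applies a light transformation, and loads partitioned Parquet into a data lake; the bucket names, paths, and column names are assumptions.

```python
# Hypothetical PySpark ETL sketch: read raw vendor data from S3,
# de-duplicate it, and write partitioned Parquet for a data lake / warehouse.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("vendor-etl-sketch")  # assumed job name
    .getOrCreate()
)

# Extract: raw JSON dropped by a third-party vendor (hypothetical path)
raw = spark.read.json("s3://example-raw-bucket/vendor/2024-01-01/")

# Transform: de-duplicate and add audit/date columns (hypothetical schema)
clean = (
    raw.dropDuplicates(["company_id"])
       .withColumn("ingested_at", F.current_timestamp())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: partitioned Parquet that Athena or Redshift Spectrum can query
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-lake-bucket/vendor_clean/"))
```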

Requirements

  • 3+ years of experience in a Data Engineer role
  • Proficiency in Linux
  • Must have SQL knowledge and experience working with relational databases and query authoring, as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena
  • Must have experience with Python/Scala
  • Must have experience with Big Data technologies like Apache Spark
  • Must have experience with Apache Airflow (see the DAG sketch after this list)
  • Experience with data pipeline and ETL tools like AWS Glue
  • Experience working with AWS cloud services (EC2, S3, RDS, Redshift) and other data solutions, e.g. Databricks and Snowflake
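
For context on the Airflow requirement above, here is a minimal, hypothetical sketch of a DAG that schedules a daily spark-submit of an ETL script; the DAG id, schedule, and script path are assumptions rather than details from the posting.

```python
# Minimal Airflow DAG sketch (hypothetical names and paths) that submits a
# PySpark job once a day, roughly the orchestration pattern this role involves.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_vendor_etl",          # assumed DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="run_spark_etl",
        # spark-submit of the ETL script sketched earlier (hypothetical path)
        bash_command="spark-submit /opt/jobs/vendor_etl.py",
    )
```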

 

Desired Skills and Experience

Python, SQL, Scala, Spark, ETL

 


About 6sense

Founded : 2013
Stage : Raised funding

About

6sense reinvents the way organizations create, manage, and convert pipeline to revenue. The 6sense Revenue AI platform captures anonymous buying signals, predicts the right accounts to target at the ideal time, and recommends the channels and messages to boost revenue performance. Removing guesswork, friction and wasted sales effort, 6sense empowers sales, marketing, and customer success teams to significantly improve pipeline quality, accelerate sales velocity, increase conversion rates, and grow revenue predictably. 6sense has been recognized for its market-defining technology by Forbes Cloud 100, G2, TrustRadius, Gartner, and Forrester, and for its strong culture by Glassdoor, Inc. Magazine, and Comparably.

Connect with the team
Saravanan K
Neha Singh
Tanya Dias
Kunjan Bhagat
Priyanka Verma
SampathKumar Venkatesh
Romesh Rawat
Bhavika Pandya
Shrutika Dhawan
Sanish Bhadbhade
Gyan S
Amruta Joshi
Sneha Chakraborty
Vineet Verma
Morgan Puravet
Company social profiles
Blog, LinkedIn, Twitter, Facebook

Similar jobs

globe teleservices
Posted by deepshikha thapar
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹15L / yr
Python
SQL

RESPONSIBILITIES:

  • Understand and elicit requirements, analyse data/workflows, and contribute to product, project, and proof-of-concept (POC) work.
  • Contribute to preparing design documents and effort estimations.
  • Develop AI/ML models using best-in-class ML techniques.
  • Build, test, and deploy AI/ML solutions.
  • Work with Business Analysts and Product Managers to assist with defining functional user stories.
  • Ensure deliverables across teams are of high quality and clearly documented.
  • Recommend ML best practices and industry standards for any ML use case.
  • Proactively take up R&D and recommend solution options for any ML use case.

REQUIREMENTS:

Required Skills

  • Overall experience of 4 to 7 years working on AI/ML framework development.
  • Good programming knowledge in Python is a must.
  • Good knowledge of R and SAS is desired.
  • Good hands-on working knowledge of SQL, data modelling, and CRISP-DM.
  • Proficiency with univariate/multivariate statistics, algorithm design, and predictive AI/ML modelling.
  • Strong knowledge of machine learning algorithms: linear regression, logistic regression, KNN, Random Forest, Support Vector Machines, and Natural Language Processing (an illustrative sketch follows this list).
  • Experience with NLP and deep neural networks using synthetic and artificial data.
  • Involvement in different phases of the SDLC and good working exposure to methodologies such as Agile.
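
As a purely illustrative sketch of the classical algorithms listed above (not part of the original posting), the following scikit-learn snippet trains and compares a logistic regression and a random forest on a synthetic dataset.

```python
# Illustrative scikit-learn sketch: compare two of the listed algorithms
# (logistic regression and random forest) on a synthetic classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

for name, model in [
    ("logistic_regression", LogisticRegression(max_iter=1000)),
    ("random_forest", RandomForestClassifier(n_estimators=200, random_state=42)),
]:
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {accuracy:.3f}")
```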

Monkfox
Posted by Tanu Mehra
Anywhere
8 - 11 yrs
₹5L - ₹10L / yr
nano electronics
vehicle dynamics
computational dynamics
Android Development
Big Data
+3 more
We are a team with a mission: to create and deliver great learning experiences to engineering students through various workshops and courses. If you are an industry professional who sees great scope for improvement in higher technical education across the country and connects with our purpose of impacting it for good, is keen on sharing technical expertise to enhance the practical learning of students, is innovative in creating and delivering content, and doesn't mind earning a few extra bucks doing this in your free time, then buzz us at [email protected] and let us discuss how together we can take technological education in the country to new heights.
Acuity Knowledge Partners
Posted by Gangadhar S
Bengaluru (Bangalore)
4 - 9 yrs
₹16L - ₹40L / yr
Python
Amazon Web Services (AWS)
CI/CD
MongoDB
MLOps
+1 more

Job Responsibilities:

1. Develop/debug applications using Python.

2. Improve code quality and code coverage for existing or new programs.

3. Deploy and integrate the Machine Learning models (a minimal serving sketch follows this list).

4. Test and validate the deployments.

5. Support the MLOps function.
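
As a minimal, hypothetical sketch of the "deploy and integrate" responsibility above (the model file name, route, and payload shape are assumptions), a trained model could be exposed behind a small HTTP API like this:

```python
# Hypothetical sketch: serve a previously trained (pickled) model over HTTP,
# the kind of deployment/integration task this role describes.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

with open("model.pkl", "rb") as f:  # assumed artifact produced during training
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    features = [payload["features"]]  # e.g. {"features": [0.1, 0.2, ...]}
    prediction = model.predict(features)[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```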


Technical Skills

1. Graduate in Engineering or Technology with strong academic credentials

2. 4 to 8 years of experience as a Python developer.

3. Excellent understanding of SDLC processes

4. Strong knowledge of Unit testing, code quality improvement

5. Cloud based deployment and integration of applications/micro services.

6. Experience with NoSQL databases, such as MongoDB, Cassandra

7. Strong applied statistics skills

8. Knowledge of creating CI/CD pipelines and touchless deployment.

9. Knowledge about API, Data Engineering techniques.

10. AWS

11. Knowledge of Machine Learning and Large Language Models.


Nice to Have

1. Exposure to financial research domain

2. Experience with JIRA, Confluence

3. Understanding of scrum and Agile methodologies

4. Experience with data visualization tools, such as Grafana, ggplot, etc.

fintech startup
Agency job
via Qrata by Rayal Rajan
Pune
4 - 12 yrs
₹15L - ₹45L / yr
Python
Linear regression
Logistic regression
Machine Learning (ML)
Algorithms

The role is with a fintech credit card company based in Pune (OneCard), within the Decision Science team.


About


Credit cards haven't changed much for over half a century so our team of seasoned bankers, technologists, and designers set out to redefine the credit card for you - the consumer. The result is OneCard - a credit card reimagined for the mobile generation. OneCard is India's best metal credit card built with full-stack tech. It is backed by the principles of simplicity, transparency, and giving back control to the user.



The Engineering Challenge


“Re-imagining credit and payments from First Principles”


Payments is an interesting engineering challenge in itself with requirements of low latency, transactional guarantees, security, and high scalability. When we add credit and engagement into the mix, the challenge becomes even more interesting with underwriting and recommendation algorithms working on large data sets. We have eliminated the current call center, sales agent, and SMS-based processes with a mobile app that puts the customers in complete control. To stay agile, the entire stack is built on the cloud with modern technologies.


Purpose of Role :


- Develop and implement the collection analytics and strategy function for the credit cards. Use analysis and customer insights to develop optimum strategy.


CANDIDATE PROFILE :


- Successful candidates will have in-depth knowledge of statistical modelling/data analysis tools (Python, R, etc.) and techniques. They will be adept communicators with good interpersonal skills, able to work with senior stakeholders in India to grow revenue, primarily by identifying, delivering, and creating new, profitable analytics solutions.


We are looking for someone who:


- Has a proven track record in collection and risk analytics, preferably in the Indian BFSI industry (this is a must)


- Can identify and deliver appropriate analytics solutions


- Is experienced in analytics team management



Essential Duties and Responsibilities :


- Responsible for delivering high quality analytical and value added services


- Responsible for automating insights and proactive actions on them to mitigate collection Risk.


- Work closely with the internal team members to deliver the solution


- Engage Business/Technical Consultants and delivery teams appropriately so that there is a shared understanding of, and agreement on, the proposed solution to be delivered


- Use analysis and customer insights to develop value propositions for customers


- Maintain and enhance the suite of suitable analytics products.


- Actively seek to share knowledge within the team


- Share findings with peers from other teams and management where required


- Actively contribute to setting best practice processes.


Knowledge, Experience and Qualifications :


Knowledge :


- Good understanding of collection analytics preferably in Retail lending industry.


- Knowledge of statistical modelling/data analysis tools (Python, R etc.), techniques and market trends


- Knowledge of different modelling frameworks like Linear Regression, Logistic Regression, Multiple Regression, LOGIT, PROBIT, time-series modelling, CHAID, CART etc.


- Knowledge of Machine learning & AI algorithms such as Gradient Boost, KNN, etc.


- Understanding of decisioning and portfolio management in banking and financial services would be added advantage


- Understanding of credit bureau would be an added advantage


Experience :


- 4 to 8 years of work experience in core analytics function of a large bank / consulting firm.


- Experience working on collection analytics is a must


- Experience handling large data volumes using data analysis tools and generating good data insights


- Demonstrated ability to communicate ideas and analysis results effectively both verbally and in writing to technical and non-technical audiences


- Excellent communication, presentation, and writing skills; strong interpersonal skills


- Motivated to meet and exceed stretch targets


- Ability to make the right judgments in the face of complexity and uncertainty


- Excellent relationship and networking skills across our different business and geographies


Qualifications :


- Master's degree in Statistics, Mathematics, Economics, Business Management, or Engineering from a reputed college

Agency job
via Morine tech by Sridevi L
Bengaluru (Bangalore)
4 - 9 yrs
₹10L - ₹25L / yr
Python
Vertex
Google Cloud Platform (GCP)
BigQuery
  • Cloud: GCP
  • Must have: BigQuery, Python, Vertex AI
  • Nice to have Services: Data Plex
  • Exp level: 5-10 years.
  • Preferred Industry (nice to have): Manufacturing – B2B sales
Streamoid Technologies Pvt Ltd
Agency job
via HyreSpree by HyreSpree Team
Bengaluru (Bangalore)
4 - 6 yrs
₹4L - ₹20L / yr
Natural Language Processing (NLP)
PyTorch
Python
Java
Solr
+1 more
Skill Set:
  • 4+ years of experience. Solid understanding of Python, Java, and general software development skills (source code management, debugging, testing, deployment, etc.).
  • Experience working with Solr and Elasticsearch.
  • Experience with NLP technologies and the handling of unstructured text; detailed understanding of text pre-processing and normalisation techniques such as tokenisation, lemmatisation, stemming, and POS tagging (a minimal pre-processing sketch follows this list).
  • Prior experience implementing traditional ML solutions for classification, regression, or clustering problems.
  • Expertise in text analytics - Sentiment Analysis, Entity Extraction, Language modelling - and associated sequence learning models (RNN, LSTM, GRU).
  • Comfortable working with deep-learning libraries (e.g. PyTorch).
  • Candidates can even be freshers with 1 or 2 years of experience; IIIT, BITS Pilani, and top local colleges are the preferred colleges and universities.
  • A Master's candidate in machine learning is preferred.
  • Can source candidates from Mu Sigma and Manthan.
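
To illustrate the pre-processing steps named above (tokenisation, POS tagging, lemmatisation), here is a minimal NLTK sketch; the sample sentence and resource downloads are assumptions added for illustration only.

```python
# Minimal NLTK sketch of common text pre-processing steps:
# tokenisation, POS tagging, and lemmatisation of a sample sentence.
import nltk
from nltk.stem import WordNetLemmatizer

# One-time resource downloads (assumed; skip if already present)
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")
nltk.download("wordnet")

text = "The engineers were building search pipelines for noisy product catalogues."

tokens = nltk.word_tokenize(text)   # tokenisation
tagged = nltk.pos_tag(tokens)       # POS tagging
lemmatizer = WordNetLemmatizer()
lemmas = [lemmatizer.lemmatize(token.lower()) for token in tokens]  # lemmatisation

print(tagged)
print(lemmas)
```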
Blue Sky Analytics
Posted by Balahun Khonglanoh
Remote only
1 - 5 yrs
Best in industry
NumPy
SciPy
Data Science
Python
pandas
+8 more

About the Company

Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!


We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers, and even colleagues from other business functions. Come save the planet with us!


Your Role

Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.

Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, openstreetmaps, demographic data, socio-econometric data and topography to extract useful insights about the events happening on our planet.

Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.

Demonstrate: Familiarity with geospatial libraries such as GDAL/Rasterio for reading/writing data, and with QGIS for making visualizations. This will also extend to using advanced statistical techniques and applying concepts like regression and properties of distributions, and conducting other statistical tests.

Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.

Requirements

These are must have skill-sets that we are looking for:

  • Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
  • Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
  • Has worked on GIS and is familiar with geospatial libraries such as GDAL and rasterio to read/write data, GIS software such as QGIS for visualisation and querying, and basic machine learning algorithms to make predictions (a short rasterio sketch follows this list).
  • Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
  • Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
  • Capable of writing clear and lucid reports and demystifying data for the rest of us.
  • Be curious and care about the planet!
  • Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
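
As a small, hypothetical illustration of the GDAL/rasterio requirement above, the sketch below reads a single-band GeoTIFF and computes basic statistics over the valid pixels; the file path is an assumption.

```python
# Hypothetical rasterio sketch: read a single-band GeoTIFF (e.g. an elevation
# or emissions raster) and compute simple statistics over the valid pixels.
import rasterio

with rasterio.open("data/example_raster.tif") as src:  # assumed path
    band = src.read(1, masked=True)  # first band, with nodata pixels masked
    print("CRS:", src.crs)
    print("Resolution:", src.res)
    print("Valid pixels:", band.count())
    print("Mean value:", float(band.mean()))
    print("Max value:", float(band.max()))
```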

Benefits

  • Work from anywhere: Work by the beach or from the mountains.
  • Open source at heart: We are building an open-source community that you can use, contribute to, and collaborate on.
  • Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
  • Flexible timings: Fit your work around your lifestyle.
  • Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
  • Work Machine of choice: Buy a device and own it after completing a year at BSA.
  • Quarterly Retreats: Yes there's work-but then there's all the non-work+fun aspect aka the retreat!
  • Yearly vacations: Take time off to rest and get ready for the next big assignment by availing the paid leaves.
Bengaluru (Bangalore)
1 - 5 yrs
₹30L - ₹40L / yr
Spark
Data Engineer
Airflow
SQL
NoSQL
+1 more
  • 3-6 years of relevant work experience in a Data Engineering role.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • A good understanding of Airflow, Spark, NoSQL databases, and Kafka is nice to have (see the streaming sketch after this list).
  • Premium Institute Candidates only
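
Purely as an illustration of the Spark-plus-Kafka combination mentioned above, here is a minimal Structured Streaming sketch; the broker address, topic name, and output paths are hypothetical, and the Spark Kafka connector package is assumed to be on the classpath.

```python
# Hypothetical Spark Structured Streaming sketch: consume a Kafka topic and
# land the raw events as Parquet (requires the spark-sql-kafka connector).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "events")                         # assumed topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/tmp/events_parquet")                 # assumed output path
    .option("checkpointLocation", "/tmp/events_checkpoint")
    .start()
)

query.awaitTermination()
```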
Fast paced Startup
Pune
3 - 6 yrs
₹15L - ₹22L / yr
Big Data
Data engineering
Hadoop
Spark
Apache Hive
+6 more

Years of Exp: 3-6+ Years
Skills: Scala, Python, Hive, Airflow, Spark

Languages: Java, Python, Shell Scripting

GCP: BigTable, DataProc, BigQuery, GCS, Pub/Sub

OR
AWS: Athena, Glue, EMR, S3, Redshift

MongoDB, MySQL, Kafka

Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time 

NACTUS India Services Pvt Ltd
Remote, Mumbai
3 - 5 yrs
₹5L - ₹7L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Software Development
Python
C++

Nactus is at the forefront of education reinvention, helping educators and the learner community at large through innovative solutions in the digital era. We are looking for an experienced AI specialist to join our revolution using deep learning and artificial intelligence. This is an excellent opportunity to take advantage of emerging trends and technologies to make a real-world difference.

 

Role and Responsibilities

  • Manage and direct research and development (R&D) and processes to meet the needs of our AI strategy.
  • Understand company and client challenges and how integrating AI capabilities can help create educational solutions.
  • Analyse and explain AI and machine learning (ML) solutions while setting and maintaining high ethical standards.

 

Skills Required

 

  • Knowledge of algorithms, object-oriented and functional design principles
  • Demonstrated artificial intelligence, machine learning, mathematical and statistical modelling knowledge and skills.
  • Well-developed programming skills – specifically in SAS or SQL and other packages with statistical and machine learning application, e.g. R, Python
  • Experience with machine learning fundamentals, parallel computing and distributed systems fundamentals, or data structure fundamentals
  • Experience with C, C++, or Python programming
  • Experience with debugging and building AI applications.
  • Analyse conclusions for robustness and productivity.
  • Develop a human-machine speech interface.
  • Verify, evaluate, and demonstrate implemented work.
  • Proven experience with ML, deep learning, Tensorflow, Python