Data Engineer
Wissen Technology


Posted by Tony Tom
6 - 12 yrs
₹2L - ₹30L / yr
Pune
Skills
Python
AWS
Spark

Location: Pune

Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake



About Wissen Technology

Founded: 2000
Type: Products & Services
Size: 1000-5000
Stage: Profitable

About

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.

With offices in the US, India, the UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.


Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.


We believe that the technology and thought leadership we command in the industry is the direct result of the kind of people we have been able to attract to form this organization (you are one of them!).


Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from premier institutions such as MIT, Wharton, the IITs, the IIMs, and BITS, and who bring rich work experience from some of the biggest companies in the world.


Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.


Connect with the team

Lokesh Manikappa
Vijayalakshmi Selvaraj
Adishi Sood
Shiva Kumar J Goud

Company social profiles

Blog · LinkedIn · Facebook

Similar jobs

Deltek
Bengaluru (Bangalore)
6 - 9 yrs
Best in industry
Python
API
Software architecture
  • Experience with Python, APIs, and architectural design
  • A minimum of 7-9 years of experience developing integration/automation solutions, or related experience
  • 3-4 years of experience in a technical architect or lead role
  • Experience working on an iPaaS platform
  • Knowledge of at least one programming language; Python preferred
  • Good understanding of integration concepts, methodologies, and technologies
  • Ability to learn new concepts and technologies and to solve problems
  • Good communication and presentation skills
  • Strong interpersonal skills, with the ability to convey and relate ideas to others and work collaboratively to get things done


Remote only
2 - 9 yrs
₹5L - ₹12L / yr
Python
Python scripting
Databases
LLM integration
Prompt engineering
+11 more

Role & Responsibilities:

We are seeking a Software Developer with 2-10 years' experience and strong foundations in Python, databases, and AI technologies. The ideal candidate will support the development of AI-powered solutions, focusing on LLM integration, prompt engineering, and database-driven workflows. This is a hands-on role with opportunities to learn and grow into advanced AI engineering responsibilities.


Key Responsibilities

• Develop, test, and maintain Python-based applications and APIs.

• Design and optimize prompts for Large Language Models (LLMs) to improve accuracy and performance.

• Work with JSON-based data structures for request/response handling.

• Integrate and manage PostgreSQL (pgSQL) databases, including writing queries and handling data pipelines (a rough sketch of this kind of workflow follows this list).

• Collaborate with the product and AI teams to implement new features.

• Debug, troubleshoot, and optimize the performance of applications and workflows.

• Stay updated on advancements in LLMs, AI frameworks, and generative AI tools.
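
By way of a hedged illustration of the responsibilities above, here is a minimal Python sketch of this kind of workflow: an engineered prompt sent to an LLM, the JSON reply parsed, and the structured result written to PostgreSQL. The model name, table, prompt, and credentials are hypothetical, and the OpenAI client is only one example of an LLM API.

```python
import json
import os

import psycopg2            # PostgreSQL driver
from openai import OpenAI  # one example of an LLM client; any provider would do

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# A prompt engineered to force a strict JSON reply (hypothetical task).
prompt = (
    "Extract the customer name and sentiment (positive/negative/neutral) "
    "from the review below. Reply with JSON only, e.g. "
    '{"customer": "...", "sentiment": "..."}.\n\n'
    "Review: Great support, thanks Priya!"
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[{"role": "user", "content": prompt}],
)

# JSON response handling; may raise if the model strays from strict JSON.
record = json.loads(resp.choices[0].message.content)

# Persist the structured result to PostgreSQL (hypothetical table/credentials).
# The `with` block commits the transaction on success.
with psycopg2.connect(dbname="appdb", user="app", password=os.environ["PGPASSWORD"]) as conn:
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO review_sentiment (customer, sentiment) VALUES (%s, %s)",
            (record["customer"], record["sentiment"]),
        )
```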


Required Skills & Qualifications


• Strong knowledge of Python (scripting, APIs, data handling).

• Basic understanding of Large Language Models (LLMs) and prompt engineering techniques.

• Experience with JSON data parsing and transformations.

• Familiarity with PostgreSQL or other relational databases.

• Ability to write clean, maintainable, and well-documented code.

• Strong problem-solving skills and eagerness to learn.

• Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).


Nice-to-Have (Preferred)

• Exposure to AI/ML frameworks (e.g., LangChain, Hugging Face, OpenAI APIs).

• Experience working in startups or fast-paced environments.

• Familiarity with version control (Git/GitHub) and cloud platforms (AWS, GCP, or Azure).


What We Offer

• The opportunity to define the future of GovTech through AI-powered solutions.

• A strategic leadership role in a fast-scaling startup with direct impact on product direction and market success.

• Collaborative and innovative environment with cross-functional exposure.

• Growth opportunities backed by a strong leadership team.

• Remote flexibility and work-life balance.

OnActive
Mansi Gupta
Posted by Mansi Gupta
Gurugram, Pune, Bengaluru (Bangalore), Chennai, Bhopal, Hyderabad, Jaipur
5 - 8 yrs
₹6L - ₹12L / yr
Python
Spark
SQL
AWS CloudFormation
Machine Learning (ML)
+3 more

Level of skills and experience:


5 years of hands-on experience using Python, Spark, and SQL.

Experienced in AWS Cloud usage and management.

Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).

Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.

Experience with orchestrators such as Airflow and Kubeflow.

Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).

Fundamental understanding of Parquet, Delta Lake and other data file formats.

Proficiency with an IaC tool such as Terraform, CDK, or CloudFormation.

Strong written and verbal English communication skills, and proficiency in communicating with non-technical stakeholders.

Building the world's largest search intelligence products.
Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr
Java
Python
Machine Learning (ML)
XSD
XML
+10 more

About the Role-

The role demands thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning, and infrastructure at massive scale.

Reason to Join-

An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with a great office, competitive pay, and an excellent benefits package.

 

Requirements and Responsibilities (please read carefully before applying)

  • Overall experience of 3-6 years with Java/Python frameworks and machine learning.
  • Develop web services using REST, XSD, XML, Java, Python, AWS, and APIs.
  • Experience with Elasticsearch, Solr, or Lucene: search engines, text mining, and indexing (see the sketch after this list).
  • Experience with highly scalable tools such as Kafka, Spark, Aerospike, etc.
  • Hands-on experience in design, architecture, implementation, performance & scalability, and distributed systems.
  • Design, implement, and deploy highly scalable and reliable systems.
  • Troubleshoot the Solr indexing process and querying engine.
  • Bachelor's or Master's in Computer Science from a Tier 1 institution.
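
As a rough illustration of the search-engine indexing and querying skills listed above, here is a minimal sketch using the Python Elasticsearch client (8.x-style API assumed); the index name, documents, and local cluster URL are hypothetical.

```python
from elasticsearch import Elasticsearch  # official Python client, 8.x-style API assumed

es = Elasticsearch("http://localhost:9200")  # hypothetical local cluster

# Index a couple of documents (the indexing step).
es.index(index="articles", id="1",
         document={"title": "Search at scale", "body": "Kafka, Spark and Lucene power the pipeline."})
es.index(index="articles", id="2",
         document={"title": "Text mining basics", "body": "Tokenization and indexing for full-text search."})
es.indices.refresh(index="articles")  # make the documents searchable immediately

# Query the index with a full-text match (the querying step).
resp = es.search(index="articles", query={"match": {"body": "indexing"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```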
Persistent System Ltd
Agency job
via Milestone Hr Consultancy by Haina khan
Bengaluru (Bangalore), Pune, Hyderabad
4 - 6 yrs
₹6L - ₹22L / yr
Apache HBase
Apache Hive
Apache Spark
Go Programming (Golang)
Ruby on Rails (ROR)
+5 more
Urgently required: Hadoop Developer for a reputed MNC.

Location: Bangalore/Pune/Hyderabad/Nagpur

4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL stores: HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in configuration management/ticketing systems such as Maven, Ant, JIRA, etc.
- Knowledge of any data integration and/or EDW tools is a plus
- Good to have: knowledge of Python/Perl/shell scripting

 

Please note: HBase, Hive, and Spark are a must.

Number Theory
3 recruiters
Nidhi Mishra
Posted by Nidhi Mishra
Gurugram
2 - 5 yrs
₹12L - ₹13L / yr
Java
Scala
Big Data
Spark
Amazon Web Services (AWS)

Experience:

The candidate should have 2+ years of experience in design and development with Java/Scala. Experience with algorithms, data structures, databases, and distributed systems is mandatory.

 

Required Skills:

Mandatory: -

  1. Core Java or Scala
  2. Experience in Big Data and Spark
  3. Extensive experience in developing Spark jobs; should possess good OOP knowledge and be aware of enterprise application design patterns.
  4. Should have the ability to analyze, design, develop, and test Spark jobs of varying complexity.
  5. Working knowledge of Unix/Linux.
  6. Hands-on experience in Spark: creating RDDs and applying transformation and action operations (see the PySpark sketch after this section).

Good To have: -

  1. Python
  2. Spark Streaming
  3. PySpark
  4. Azure/AWS cloud knowledge on the data storage and compute side
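
As a small illustration of the RDD workflow mentioned in the mandatory skills, here is a minimal PySpark sketch: build an RDD, apply lazy transformations, and trigger execution with an action. The input data and app name are hypothetical.

```python
from pyspark.sql import SparkSession

# Start a local Spark session (hypothetical app name).
spark = SparkSession.builder.appName("rdd-demo").master("local[*]").getOrCreate()
sc = spark.sparkContext

# Create an RDD from an in-memory collection of (word, count) pairs.
rdd = sc.parallelize([("spark", 1), ("scala", 1), ("spark", 1), ("java", 1)])

# Transformations are lazy: nothing executes until an action is called.
counts = rdd.reduceByKey(lambda a, b: a + b).filter(lambda kv: kv[1] > 1)

# collect() is the action that triggers execution and returns results to the driver.
print(counts.collect())  # e.g. [('spark', 2)]

spark.stop()
```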

 



Technology service company
Agency job
via Jobdost by Riya Roy
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Ansible
+11 more
  • Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.

  • Well-versed in, with 3+ years of hands-on, demonstrable experience in:
    ▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
    ▪ Distributed cloud-native computing, including serverless functions
    ▪ Relational, object store, document, graph, etc. database design & implementation
    ▪ Microservices architecture, API modeling, design, & programming

  • 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.

  • Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a streaming sketch follows this list).

  • In-depth knowledge of standard programming languages such as Scala and/or Java.

  • 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache Nifi, Zookeeper, Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.

  • 3+ years of hands-on development experience in one or more Relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elastic Search, Neo4J, etc.

  • Practical knowledge of distributed systems involving partitioning, bucketing, CAP theorem, replication, horizontal scaling, etc.

  • Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.

  • Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.

  • Perform benchmarking/stress tests and document the best practices for different applications.

  • Proactively work with tenants on improving the overall performance and ensure the system is resilient, and scalable.

  • Good understanding of Virtualization & Containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.

  • Well-versed with demonstrable working experience with API Management, API Gateway, Service Mesh, Identity & Access Management, Data Protection & Encryption.

  • Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.

  • Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).

  • Good understanding of Storage, Networks and Storage Networking basics which will enable you to work in a Cloud environment.

  • Good understanding of Network, Data, and Application Security basics which will enable you to work in a Cloud as well as Business Applications / API services environment.
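
As a hedged illustration of the Spark Structured Streaming experience described above, here is a minimal PySpark sketch that reads a Kafka topic and maintains running word counts. The broker address and topic are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode, split

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# Read a stream of events from a (hypothetical) Kafka topic.
# Requires the spark-sql-kafka connector package at submit time.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka values arrive as bytes; cast to string and tokenize into words.
words = events.select(explode(split(col("value").cast("string"), " ")).alias("word"))

# A stateful aggregation: running count per word.
counts = words.groupBy("word").count()

# Write the continuously updated counts to the console.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```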

AI Educator
Gajendra Rayaghada
Posted by Gajendra Rayaghada
Remote, Hyderabad
2 - 3 yrs
₹3L - ₹8L / yr
Python
Django
RESTful APIs
JavaScript
PostgreSQL
+12 more
Python Django Developer

Job Description for Python Backend Developer
2+ years of expertise in Python 3.7 and Django 2 (or Django 3).
Familiarity with some ORM (Object Relational Mapper) libraries.
Able to integrate multiple data sources and databases into one system.
Integration of user-facing elements developed by front-end developers with server-side logic in Django (RESTful APIs); see the sketch at the end of this listing.
Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
Knowledge of user authentication and authorization between multiple systems, servers, and environments
Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
Able to create database schemas that represent and support business processes
Strong unit test and debugging skills.
Proficient understanding of code versioning tools such as Git.
Desirable (optional):
Django Channels, WebSockets, asyncio.
Experience working with AWS or similar Cloud services.
Experience in containerization technologies such as Docker.
Understanding of fundamental design principles behind a scalable application (caching, Redis)
Role: Software Developer
Industry Type: IT-Software, Software Services
Employment Type: Full Time
Role Category: Programming & Design
Qualification: Any Graduate in Any Specialization
Key Skills: Python 3.7, Django 2.0 onwards, REST APIs, ORM, front end for interfacing only (curl, Postman, Angular for testing), Docker (optional), database (PostgreSQL), GitHub
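
As an informal illustration of the Django skills in this listing, here is a minimal sketch of a model-backed JSON endpoint: a schema representing a simple business process and a RESTful view the front end could call. The app, model, and URL are hypothetical and condensed into one file for brevity; in a real project they live in an installed app's modules.

```python
from django.db import models
from django.http import JsonResponse
from django.views.decorators.http import require_GET


class Order(models.Model):
    """A database schema representing a simple business process (hypothetical)."""
    customer = models.CharField(max_length=100)
    total = models.DecimalField(max_digits=10, decimal_places=2)
    created_at = models.DateTimeField(auto_now_add=True)


@require_GET
def order_list(request):
    """RESTful endpoint returning recent orders as JSON for the front end."""
    orders = Order.objects.order_by("-created_at")[:20]
    data = [
        {"id": o.id, "customer": o.customer, "total": str(o.total)}
        for o in orders
    ]
    return JsonResponse({"orders": data})

# urls.py would map e.g. path("api/orders/", order_list) to this view.
```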
Healofy
3 recruiters
Shubham Maheshwari
Posted by Shubham Maheshwari
Bengaluru (Bangalore)
1 - 7 yrs
₹15L - ₹40L / yr
Java
Google App Engine (GAE)
Apache Kafka
NOSQL Databases
Firebase
+3 more
RESPONSIBILITIES:
  1. Full ownership of tech, right from driving product decisions to architecture to deployment.
  2. Develop cutting-edge user experiences and build cutting-edge technology solutions such as instant messaging on poor networks, live discussions, live videos, and optimal matching.
  3. Use billions of data points to build a user personalisation engine.
  4. Build a data network effects engine to increase engagement and virality.
  5. Scale the systems to billions of daily hits.
  6. Deep dive into performance, power management, memory optimisation, and network connectivity optimisation for the next billion Indians.
  7. Orchestrate complicated workflows, asynchronous actions, and higher-order components.
  8. Work directly with the Product and Design teams.

REQUIREMENTS:
  1. Should have hacked some (computer or non-computer) system to your advantage.
  2. Built and managed systems with a scale of 10Mn+ daily hits.
  3. Strong architectural experience.
  4. Strong experience in memory management, performance tuning, and resource optimisation.
  5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT.

P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.
Saama Technologies
6 recruiters
Sandeep Chaudhary
Posted by Sandeep Chaudhary
Pune
6 - 11 yrs
₹1L - ₹12L / yr
Data Analytics
MySQL
Python
Spark
Tableau
Description / Requirements:
  • Overall experience of 10 years, with a minimum of 6 years of data analysis experience.
  • MBA in Finance or a similar background profile.
  • Ability to lead projects and work independently.
  • Must have the ability to write complex SQL, including cohort analysis, comparative analysis, etc.
  • Experience working directly with business users to build reports and dashboards and to answer business questions with data.
  • Experience doing analysis using Python and Spark is a plus.
  • Experience with MicroStrategy or Tableau is a plus.