
Location: Pune
Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake

About Wissen Technology
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.


We are seeking a skilled and innovative Developer with strong expertise in Scala, Java/Python and Spark/Hadoop to join our dynamic team.
Key Responsibilities:
• Design, develop, and maintain robust and scalable backend systems using Scala, Spark, and Hadoop, with expertise in Python/Java.
• Build and deploy a highly efficient, modular, and maintainable microservices architecture for enterprise-level applications.
• Write and optimize algorithms to enhance application performance and scalability.
Required Skills:
• Programming: Expert in Scala and object-oriented programming.
• Frameworks: Hands-on experience with Spark and Hadoop.
• Databases: Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
• Location: Mumbai
• Employment Type: Full-time



About the Role:
Thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning and infrastructure at a massive scale.
Reason to Join:
An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with an amazing office, competitive pay, and an excellent benefits package.
Requirements and Responsibilities (please read carefully before applying):
- Overall experience of 3-6 years with Java/Python frameworks and Machine Learning.
- Experience developing web services using REST, XSD, XML, Java, Python, AWS, and APIs.
- Experience with Elasticsearch, Solr, or Lucene: search engines, text mining, and indexing.
- Experience with highly scalable tools such as Kafka, Spark, and Aerospike.
- Hands-on experience in design, architecture, implementation, performance & scalability, and distributed systems.
- Design, implement, and deploy highly scalable and reliable systems.
- Troubleshoot the Solr indexing process and querying engine.
- Bachelor's or Master's in Computer Science from a Tier 1 institution
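Since this role leans on Lucene/Solr/Elasticsearch-style search and indexing, a minimal plain-Python sketch of the inverted index those engines are built on may help set expectations. The `build_index` and `search` helpers are illustrative only, not any library's API:

```python
from collections import defaultdict

# Sketch of the inverted index underlying Lucene/Solr/Elasticsearch:
# each term maps to the set of document IDs that contain it.
def build_index(docs):
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    # AND semantics: return documents containing every query term.
    terms = query.lower().split()
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        results &= index.get(term, set())
    return results

docs = {
    1: "spark streaming on kafka",
    2: "solr indexing and querying",
    3: "kafka and solr at scale",
}
index = build_index(docs)
print(search(index, "solr"))        # {2, 3}
print(search(index, "kafka solr"))  # {3}
```

Real engines add tokenization, relevance scoring, and on-disk postings lists, but the term-to-documents mapping above is the core idea behind "indexing" in the requirement.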


We have an urgent requirement for a Data Engineer/Sr. Data Engineer at a reputed MNC.
Experience: 4-9 years
Location: Pune/Bangalore/Hyderabad
Skills: Python + AWS, PySpark + AWS, or Spark + Scala

Experience:
The candidate should have 2+ years of experience in design and development with Java/Scala. Experience with algorithms, data structures, databases, and distributed systems is mandatory.
Required Skills:
Mandatory:
- Core Java or Scala
- Experience in Big Data, Spark
- Extensive experience developing Spark jobs. Should possess good OOP knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop, and test Spark jobs of varying complexity.
- Working knowledge of Unix/Linux.
- Hands-on experience in Spark: creating RDDs and applying transformations and actions
Good to have:
- Python
- Spark Streaming
- PySpark
- Azure/AWS cloud knowledge on the data storage and compute side
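For candidates unfamiliar with the transformation/action distinction mentioned above, here is a minimal plain-Python sketch of the idea, with no Spark dependency. `MiniRDD` is a hypothetical illustration, not Spark's API: transformations (`map`, `filter`) are lazy and only record a plan, while an action (`collect`, `count`) triggers evaluation.

```python
# Toy model of Spark's lazy RDD evaluation (illustrative only).
class MiniRDD:
    def __init__(self, data, plan=None):
        self._data = data
        self._plan = plan or []          # deferred transformations

    def map(self, fn):                   # transformation: lazy
        return MiniRDD(self._data, self._plan + [("map", fn)])

    def filter(self, pred):              # transformation: lazy
        return MiniRDD(self._data, self._plan + [("filter", pred)])

    def collect(self):                   # action: runs the recorded plan
        out = iter(self._data)
        for kind, fn in self._plan:
            out = map(fn, out) if kind == "map" else filter(fn, out)
        return list(out)

    def count(self):                     # action
        return len(self.collect())

rdd = MiniRDD(range(10))
squares_of_evens = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
# Nothing has executed yet; collect() triggers the pipeline.
print(squares_of_evens.collect())  # [0, 4, 16, 36, 64]
print(squares_of_evens.count())    # 5
```

In real Spark the same chaining looks nearly identical (`sc.parallelize(range(10)).filter(...).map(...).collect()`), but the plan is distributed across executors rather than run in-process.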
Your Opportunity
- Own and drive business features into tech requirements
- Design & develop large-scale, real-time server-side systems
- Quickly create quality prototypes
- Stay updated on emerging technologies
- Ensure that all deliverables adhere to our world-class standards
- Promote coding best practices
- Mentor and develop junior developers in the team
Required Experience:
- 4+ years of relevant experience as described below
- Excellent grasp of Core Java, multithreading, and OO design patterns
- Experience with Scala, functional and reactive programming, and Akka/Play is a plus
- Excellent understanding of data structures and algorithms
- Solid grasp of large-scale distributed real-time systems
- Prior experience building scalable and resilient microservices
- Solid understanding of relational databases, NoSQL databases, and caching systems
- Good understanding of Big Data technologies such as Spark and Hadoop is a plus
- Experience with one of AWS, Azure, or GCP
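As a concrete instance of the "caching systems" requirement above, here is a minimal LRU cache sketch. It is illustrative only, built on Python's standard `OrderedDict`; production caches (Redis, Memcached) add eviction policies, TTLs, and network access on top of the same idea:

```python
from collections import OrderedDict

# Minimal least-recently-used (LRU) cache: bounded capacity,
# evicting the entry that was touched longest ago.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)     # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now most recently used
cache.put("c", 3)     # capacity exceeded: evicts "b"
print(cache.get("b")) # None
print(cache.get("a")) # 1
```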
Who you are :
- You have excellent and effective communication and collaborative skills
- You love problem solving
- You stay up to date with the latest technologies and then apply them in real life
- You love paying attention to detail
- You thrive in meeting tight deadlines and prioritising workloads
- Ability to collaborate across multiple functions
Education:
Bachelor’s degree in Engineering or equivalent experience within the field
About Vymo
Vymo is a San Francisco-based next-generation sales productivity SaaS company with offices in 7 locations. Vymo is funded by top-tier VC firms like Emergence Capital and Sequoia Capital. Vymo is a category creator: an intelligent Personal Sales Assistant that captures sales activities automatically, learns from top performers, and predicts 'next best actions' contextually. Vymo has 100,000 users in 60+ large enterprises such as AXA, Allianz, and Generali. Vymo has seen 3x annual growth over the last few years and aspires to do even better this year by building up the team globally.
What is the Personal Sales Assistant
A game-changer! We thrive in the CRM space, where every company is struggling to deliver meaningful engagement to their Sales teams and IT systems. Vymo was engineered with a mobile-first philosophy. The platform, through AI/ML, detects, predicts, and learns how to make Sales Representatives more productive through nudges and suggestions on a mobile device. Explore Vymo at https://getvymo.com/
What you will do at Vymo
From young open source enthusiasts to experienced Googlers, this team develops products like Lead Management System, Intelligent Allocations & Route mapping, Intelligent Interventions, that help improve the effectiveness of the sales teams manifold. These products power the "Personal Assistant" app that automates the sales force activities, leveraging our cutting edge location based technology and intelligent routing algorithms.
A Day in your Life
- Design, develop and maintain robust data platforms on top of Kafka, Spark, ES etc.
- Provide leadership to a group of engineers in an innovative and fast-paced environment.
- Manage and drive complex technical projects from the planning stage through execution.
What you would have done
- B.E. (or equivalent) in Computer Science
- 6-9 years of experience building enterprise class products/platforms.
- Knowledge of Big data systems and/or Data pipeline building experience is preferred.
- 2-3 years of relevant work experience as technical lead or technical management experience.
- Excellent coding skills in one of Core Java or NodeJS
- Demonstrated problem solving skills in previous roles.
- Good communication skills.
- Minimum 7 years of relevant work experience in similar roles.
- Hands-on experience developing and delivering scalable multi-tenant SaaS applications on AWS platform.
- In-depth knowledge of Spring, Spring Boot, Java, REST Web Services, SQL databases, microservices, GRAND stack, SQL and NoSQL databases.
- In-depth knowledge and experience developing and delivering scalable data lakes, data ingestion and processing pipelines, data access microservices.
- In-depth knowledge of AWS platform, tools and services, specifically AWS networking and security, Route53, API Gateway, ECS/Fargate, RDS; Java/Spring development; modern database and data processing technologies; DevOps; microservices architecture; container/Docker technology.
- Outstanding collaboration and communication skills. Ability to effectively collaborate with distributed team.
- Understand and practice agile development methodology.
- Prior experience working in a software product company.
- Prior experience with security product development.
Nice to Have:
- AWS Certified Developer certification is highly desired.
- Prior experience with Apache Spark and Scala.


