
Location: Pune
Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake

About Wissen Technology
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.

Mandatory (Experience 1) - Must have 4+ years of experience in backend software development.
Mandatory (Experience 2) - Must have 4+ years of experience in backend development using Python (highly preferred), Java, or Node.js.
Mandatory (Experience 3) - Must have experience with cloud platforms like AWS (highly preferred), GCP, or Azure.
Mandatory (Experience 4) - Must have experience with databases such as MySQL / PostgreSQL / Postgres / Oracle / SQL Server / DB2 / SQL / MongoDB / Ne
- Build campaign generation services that can send app notifications at a rate of 10 million per minute
- Build dashboards that show real-time key performance indicators to clients
- Develop complex user segmentation engines that create segments over terabytes of data within seconds
- Build highly available and horizontally scalable platform services for ever-growing data
- Use cloud-based services like AWS Lambda for high throughput and auto-scaling
- Work on complex analytics over terabytes of data, such as building cohorts, funnels, user path analysis, and Recency-Frequency-Monetary (RFM) analysis at speed
- You will build backend services and APIs to create scalable engineering systems.
- As an individual contributor, you will tackle some of our broadest technical challenges, which require deep technical knowledge, hands-on software development, and seamless collaboration across functions.
- You will envision and develop features that are highly reliable and fault-tolerant to deliver a superior customer experience.
- Collaborate with various cross-functional teams in the company to meet deliverables throughout the software development lifecycle.
- Identify areas of improvement through data insights and research, and act on them.
- 2-5 years of experience in backend development, with hands-on work in Java and scripting in Shell, Perl, or Python.
- Solid understanding of engineering best practices, continuous integration, and incremental delivery.
- Strong analytical, debugging, and troubleshooting skills, plus product line analysis.
- Follower of agile methodology (sprint planning, working in JIRA, retrospectives, etc.).
- Proficiency with tools like Docker, Maven, and Jenkins, and knowledge of Java frameworks such as Spring, Spring Boot, Hibernate, and JPA.
- Ability to design application modules using concepts such as object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPC mechanisms, and database interfaces.
- Hands-on experience with Redis, MySQL, streaming technologies like Kafka (producers and consumers), and NoSQL databases like MongoDB/Cassandra.
- Knowledge of version control with Git and deployment processes such as CI/CD.
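As an illustration of the RFM analysis mentioned in the responsibilities above, a minimal pandas sketch might look like the following. The column names (user_id, event_date, amount) and the sample data are assumptions for demonstration only.

```python
# Minimal Recency-Frequency-Monetary (RFM) scoring sketch with pandas.
# Column names and sample data are illustrative assumptions.
import pandas as pd

events = pd.DataFrame({
    "user_id": ["a", "a", "b", "c", "c", "c"],
    "event_date": pd.to_datetime(
        ["2024-01-01", "2024-03-01", "2024-02-15",
         "2024-01-10", "2024-02-01", "2024-03-10"]
    ),
    "amount": [100.0, 50.0, 200.0, 10.0, 20.0, 30.0],
})

today = pd.Timestamp("2024-03-15")  # reference date for recency

rfm = events.groupby("user_id").agg(
    recency=("event_date", lambda d: (today - d.max()).days),  # days since last event
    frequency=("event_date", "count"),                         # number of events
    monetary=("amount", "sum"),                                # total spend
)
```

At production scale (terabytes, as the posting describes), the same groupby/aggregate shape would run on a distributed engine such as Spark rather than in-memory pandas.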
1. Design software and make technology choices across the stack (from data storage to application to front-end)
2. Understand a range of tier-1 systems/services that power our product to make scalable changes to critical path code
3. Own the design and delivery of an integral piece of a tier-1 system or application
4. Work closely with product managers, UX designers, and end users, and integrate software components into a fully functional system
5. Work on the management and execution of project plans and delivery commitments
6. Take ownership of a product/feature end-to-end for all phases from development to production
7. Ensure the developed features are scalable and highly available with no quality concerns
8. Work closely with senior engineers on refinement and implementation
9. Manage and execute project plans and delivery commitments
10. Create and execute appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), as well as automation and testing strategies. The role translates business needs into technical solutions while adhering to established data guidelines and approaches from a business unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities :
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements :
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation.
- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and with NoSQL databases and cloud data warehouses (e.g., Cassandra, Redshift, BigQuery).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.
- Have a deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus
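As context for the Spark tuning and resource-allocation requirement above, the usual knobs can be sketched as plain configuration values. The specific numbers are illustrative assumptions for a mid-sized job, not recommendations for any particular cluster.

```python
# Illustrative Spark resource-allocation and tuning settings, expressed as
# the key/value pairs that would be passed to spark-submit or SparkConf.
# All values here are assumptions, not universal defaults.
spark_conf = {
    "spark.executor.instances": "10",       # parallelism across the cluster
    "spark.executor.cores": "4",            # cores per executor
    "spark.executor.memory": "8g",          # heap per executor
    "spark.sql.shuffle.partitions": "200",  # partition count after shuffles
    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",  # faster serialization
    "spark.sql.adaptive.enabled": "true",   # adaptive query execution (Spark 3+)
}
```

Tuning typically iterates on executor count/memory against the cluster's capacity and on shuffle partitions against the data volume, guided by the Spark UI.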
Project Overview: We are looking for an expert-level Postgres database developer to work on a software application development project for a Fortune 500, US-based telecom client. The application is web-based and is used across multiple teams to support their business processes. The developer will be responsible for developing various components of the Postgres database and for light administration of the database.
Key Responsibilities: Collaborate with onshore, offshore and other team members to understand the user stories and develop code. Develop and execute scripts to unit test. Collaborate with onshore developers, product owner and the client team to perform work in an integrated manner.
Professional Attributes:
- Ability to work independently and seek guidance as and when necessary
- Good communication skills
- Flexibility to work in different time zones if necessary
- Good team player
- Mentoring juniors
Experience preferred:
- Extensive experience in Postgres database development (expert level)
- Experience in Postgres administration.
- Must have working experience with GIS data functionality
- Experience handling large datasets (tables with 50-100M rows)
- Preferred – exposure to Azure or AWS
- Must have database performance tuning skills
- Familiarity with web applications
- Ability to work independently with minimal oversight
- Experience working cohesively in integrated teams
- Good interpersonal, communication, documentation and presentation skills.
- Prior experience working in agile environments
- Ability to communicate effectively both orally and in writing with clients, Business Analysts and Developers
- Strong analytical, problem-solving and conceptual skills
- Excellent organizational skills; attention to detail
- Ability to resolve project issues effectively and efficiently
- Ability to prioritize workload and consistently meet deadlines
- Experience working with onshore-offshore model
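To make the Postgres performance-tuning and GIS requirements above concrete, the statements below sketch the kind of SQL involved, held as plain strings. The table and column names (site_events, geom, event_time) are hypothetical; the GIST-indexable geometry type assumes the PostGIS extension.

```python
# Illustrative Postgres tuning statements for a large GIS-enabled table.
# Table/column names are hypothetical assumptions.

# Spatial index for GIS queries; CONCURRENTLY avoids blocking writes while
# the index builds on a large (50-100M row) table.
CREATE_GIST_INDEX = """
CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_site_events_geom
    ON site_events USING GIST (geom);
"""

# EXPLAIN (ANALYZE, BUFFERS) is the standard starting point for tuning:
# it shows the actual plan, row counts, timing, and buffer usage.
EXPLAIN_RECENT_EVENTS = """
EXPLAIN (ANALYZE, BUFFERS)
SELECT count(*)
FROM site_events
WHERE event_time >= now() - interval '7 days';
"""
```

In practice these would be executed against the database via psql or a driver such as psycopg2, and index choices verified by comparing the plans before and after.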
Responsibilities:
• As a Senior Backend Engineer, you will design, implement, and build server-side components that run seamlessly on the Tickertape product, which is loved and used by millions of investors every day.
• You will partner with other engineers to build high-performance REST & WebSocket APIs to power our frontend experiences.
• Influence best practices in the team.
• Perform data analysis and troubleshoot technical issues with platforms, performance, and data discrepancies.
Requirements:
• 5 - 7 years of experience
• Good programming skills in languages such as Go or JavaScript/TypeScript (Node.js)
• A good understanding of RDBMS (PostgreSQL), NoSQL systems (MongoDB, Elasticsearch), time-series DBs (InfluxDB, TimescaleDB), queuing systems (Kafka, SQS), caching technologies (Redis), and cloud technologies (AWS) is a must
• Web development concepts - basics of REST APIs, server architecture
• Extremely good at problem-solving, interested in building things from scratch, and a self-learner
• Good team player with the ability to collaborate with others
• Interest (and/or experience) in the financial/stock market space - interest trumps experience
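The caching requirement above (Redis) most often takes the cache-aside form. A minimal sketch follows, with a plain dict standing in for Redis and fetch_quote as a hypothetical slow data source; real code would use a client such as redis-py (get/setex) instead.

```python
# Cache-aside sketch: check the cache first, fall back to the source on a
# miss, then write the result back with a timestamp for TTL expiry.
# A dict stands in for Redis; fetch_quote is a hypothetical slow lookup.
import time

cache = {}          # stands in for Redis
TTL_SECONDS = 5.0   # how long a cached entry stays fresh

def fetch_quote(symbol):
    # Hypothetical expensive lookup (e.g. a database or upstream API call).
    return {"symbol": symbol, "price": 100.0}

def get_quote(symbol):
    entry = cache.get(symbol)
    if entry is not None and time.monotonic() - entry[0] < TTL_SECONDS:
        return entry[1]                        # cache hit: skip the source
    value = fetch_quote(symbol)                # cache miss: go to the source
    cache[symbol] = (time.monotonic(), value)  # write back with a timestamp
    return value
```

The same shape works for the REST endpoints the posting describes: the cache absorbs repeated reads for hot symbols while the TTL bounds staleness.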

Job Description for Python Backend Developer
2+ years of expertise in Python 3.7 and Django 2 (or Django 3).
Familiarity with ORM (Object-Relational Mapper) libraries.
Able to integrate multiple data sources and databases into one system.
Integration of user-facing elements developed by front-end developers with server-side logic in Django (RESTful APIs).
Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
Knowledge of user authentication and authorization between multiple systems, servers, and environments
Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
Able to create database schemas that represent and support business processes
Strong unit test and debugging skills.
Proficient understanding of code versioning tools such as Git.
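The requirement above on authentication and authorization between multiple systems can be illustrated with a minimal signed-token sketch using only the standard library. The SECRET and payload format are assumptions; production systems would normally use a vetted standard such as JWT (e.g. via PyJWT) rather than hand-rolled tokens.

```python
# Minimal shared-secret token sketch for service-to-service authentication.
# SECRET and the payload format are illustrative assumptions; use a vetted
# library (e.g. PyJWT) in production.
import hashlib
import hmac

SECRET = b"shared-secret"  # assumption: both services hold this key

def sign(payload: str) -> str:
    # Append an HMAC-SHA256 signature so the receiver can verify integrity.
    mac = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{mac}"

def verify(token: str) -> bool:
    # Recompute the signature and compare in constant time.
    payload, _, mac = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)
```

Any tampering with the payload or signature makes verification fail, which is the property that lets one system trust tokens minted by another.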
Desirable (optional):
Django Channels, WebSockets, asyncio.
Experience working with AWS or similar Cloud services.
Experience in containerization technologies such as Docker.
Understanding of fundamental design principles behind a scalable application (caching, Redis)
Role: Software Developer
Industry Type: IT-Software, Software Services
Employment Type: Full Time
Role Category: Programming & Design
Qualification: Any Graduate in Any Specialization
Key Skills – Python 3.7, Django 2.0 onwards, REST APIs, ORM, front end for interfacing only (curl, Postman, Angular for testing), Docker (optional), database (PostgreSQL), GitHub
- Minimum 7 years of relevant work experience in similar roles.
- Hands-on experience developing and delivering scalable multi-tenant SaaS applications on AWS platform.
- In-depth knowledge of Spring, Spring Boot, Java, REST web services, microservices, the GRAND stack, and SQL and NoSQL databases.
- In-depth knowledge and experience developing and delivering scalable data lakes, data ingestion and processing pipelines, data access microservices.
- In-depth knowledge of AWS platform, tools and services, specifically AWS networking and security, Route53, API Gateway, ECS/Fargate, RDS; Java/Spring development; modern database and data processing technologies; DevOps; microservices architecture; container/Docker technology.
- Outstanding collaboration and communication skills. Ability to collaborate effectively with distributed teams.
- Understand and practice agile development methodology.
- Prior experience working in a software product company.
- Prior experience with security product development.
Nice to Have:
- AWS Certified Developer certification is highly desired.
- Prior experience with Apache Spark and Scala.


