Location: Pune
Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake
About Wissen Technology
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, the UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.
Requirements
- 3+ years of work experience with production-grade Python; contributions to open-source repos are preferred
- Experience writing concurrent and distributed programs; AWS Lambda, Kubernetes, Docker, and Spark experience is preferred
- Experience with one relational and one non-relational DB is preferred
- Prior work in the ML domain is a big plus
What You’ll Do
- Help realize the product vision: Production-ready machine learning models with monitoring within moments, not months.
- Help companies deploy their machine learning models at scale across a wide range of use-cases and sectors.
- Build integrations with other platforms to make it easy for our customers to use our product without changing their workflow.
- Write maintainable, scalable, performant Python code
- Build gRPC and REST API servers
- Work with Thrift, Protocol Buffers, etc.
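The API-serving work above can be sketched as a minimal JSON-over-HTTP prediction endpoint. This is a hedged illustration using only the Python standard library; the `/predict` route, the `MODELS` registry, and the payload shape are hypothetical, not the product's actual API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread

# Hypothetical model registry; a real service would load trained models here.
MODELS = {"churn-v1": lambda features: sum(features) > 1.0}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect JSON like {"model": "churn-v1", "features": [0.4, 0.9]}
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        model = MODELS.get(body.get("model"))
        if model is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps({"prediction": bool(model(body["features"]))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port: int = 0) -> HTTPServer:
    """Start the server on an ephemeral port in a background thread."""
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A production gRPC or REST server would add schema validation, authentication, and a proper framework, but the request/response contract is the same idea.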
- Bachelor's or Master’s degree in Computer Science or equivalent area
- 10 to 20 years of experience in software development
- Hands-on experience designing and building B2B or B2C products
- 3+ years architecting SaaS/Web based customer facing products, leading engineering teams as software/technical architect
- Experience with engineering practices such as code refactoring, microservices, design and enterprise integration patterns, test- and design-driven development, continuous integration, building highly scalable applications, and application and infrastructure security
- Strong cloud infrastructure experience with AWS and/or Azure
- Experience building event driven systems and working with message queues/topics
- Broad working experience across multiple programming languages and frameworks, with in-depth experience in one or more of the following: .NET, Java, Scala, or Go
- Hands-on experience with relational databases like SQL Server, PostgreSQL and document stores like Elasticsearch or MongoDB
- Hands-on experience with Big Data processing technologies like Hadoop/Spark is a plus
- Hands-on experience with container technologies like Docker, Kubernetes
- Knowledge of Agile software development process
Project Overview: We are looking for an expert-level Postgres database developer to work on a software application development project for a Fortune 500 US-based telecom client. The application is web based and used across multiple teams to support their business processes. The developer will be responsible for developing various components of the Postgres database and for light administration of the database.
Key Responsibilities: Collaborate with onshore, offshore, and other team members to understand user stories and develop code. Develop and execute unit-test scripts. Collaborate with onshore developers, the product owner, and the client team to perform work in an integrated manner.
Professional Attributes:
- Ability to work independently and seek guidance as and when necessary
- Good communication skills
- Flexibility to work across time zones if necessary
- Good team player
- Willingness to mentor juniors
Experience preferred:
- Extensive experience in Postgres database development (expert level)
- Experience in Postgres administration.
- Must have working experience with GIS data functionality
- Experience handling large datasets (tables with 50-100M rows)
- Preferred – exposure to Azure or AWS
- Must have skillsets for database performance tuning
- Familiarity with web applications
- Ability to work independently with minimal oversight
- Experience working cohesively in integrated teams
- Good interpersonal, communication, documentation and presentation skills.
- Prior experience working in agile environments
- Ability to communicate effectively both orally and in writing with clients, Business Analysts and Developers
- Strong analytical, problem-solving and conceptual skills
- Excellent organizational skills; attention to detail
- Ability to resolve project issues effectively and efficiently
- Ability to prioritize workload and consistently meet deadlines
- Experience working with onshore-offshore model
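Database performance tuning of the kind listed above usually starts with reading query plans before and after adding an index. A minimal sketch follows, using the standard library's SQLite in place of Postgres (the workflow carries over to Postgres's `EXPLAIN ANALYZE`); the `assets` table and index name are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id INTEGER PRIMARY KEY, region TEXT, value REAL)")
conn.executemany(
    "INSERT INTO assets (region, value) VALUES (?, ?)",
    [(f"region-{i % 50}", float(i)) for i in range(1000)],
)

def plan(sql: str) -> str:
    # Concatenate the query-plan detail rows into one readable string.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT value FROM assets WHERE region = 'region-7'"
before = plan(query)  # full table scan: every row is examined
conn.execute("CREATE INDEX idx_assets_region ON assets (region)")
after = plan(query)   # the same query now resolves via the index
```

On a Postgres deployment, the equivalent step is `EXPLAIN (ANALYZE, BUFFERS)` plus `pg_stat_statements` to find which queries are worth indexing.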
Experience:
The candidate should have 2+ years of experience with design and development in Java/Scala. Experience with algorithms, data structures, databases, and distributed systems is mandatory.
Required Skills:
Mandatory:
- Core Java or Scala
- Experience in Big Data and Spark
- Extensive experience developing Spark jobs; should possess good OOP knowledge and be aware of enterprise application design patterns
- Ability to analyze, design, develop, and test Spark jobs of varying complexity
- Working knowledge of Unix/Linux
- Hands-on experience in Spark: creating RDDs and applying transformations and actions
Good to have:
- Python
- Spark Streaming
- PySpark
- Azure/AWS cloud knowledge on the data storage and compute side
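The RDD transformation-vs-action model mentioned above can be sketched without a Spark cluster: transformations only record a plan, and an action forces evaluation. This is a plain-Python toy, not PySpark itself; `MiniRDD` is a hypothetical stand-in for illustration.

```python
from functools import reduce

class MiniRDD:
    """Toy stand-in for a Spark RDD: transformations build a lazy plan,
    actions execute it. Purely illustrative and not distributed."""

    def __init__(self, data, plan=None):
        self._data = data
        self._plan = plan or []  # list of (op, fn) pairs, unevaluated

    # --- transformations: lazy, each returns a new MiniRDD ---
    def map(self, fn):
        return MiniRDD(self._data, self._plan + [("map", fn)])

    def filter(self, fn):
        return MiniRDD(self._data, self._plan + [("filter", fn)])

    # --- actions: force evaluation of the whole plan ---
    def collect(self):
        out = iter(self._data)
        for op, fn in self._plan:
            out = map(fn, out) if op == "map" else filter(fn, out)
        return list(out)

    def reduce(self, fn):
        return reduce(fn, self.collect())

rdd = MiniRDD(range(10)).map(lambda x: x * x).filter(lambda x: x % 2 == 0)
# Nothing has run yet; collect() is the action that triggers evaluation.
evens = rdd.collect()
total = rdd.reduce(lambda a, b: a + b)
```

In real Spark the same shape applies: `rdd.map(...).filter(...)` builds a DAG, and `collect()`, `reduce()`, or `count()` triggers the job.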
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. At least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
  - Stream and batch Big Data pipeline processing using Apache Spark and/or Apache Flink
  - Distributed cloud-native computing, including serverless functions
  - Relational, object store, document, graph, etc. database design and implementation
  - Microservices architecture, API modeling, design, and programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and for analyzing performance, scalability, and capacity issues in Big Data platforms.
- Ability to clearly distinguish system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance, and ensure the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API management, API gateways, service mesh, identity and access management, and data protection and encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
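The partitioning and replication concepts listed above can be sketched in a few lines: hash a key to pick a home partition, then place replicas on consecutive nodes. This is a simplified, hedged illustration of the general scheme; real stores like Cassandra use virtual nodes and tunable consistency on top of this idea.

```python
import hashlib

def partition(key: str, num_partitions: int) -> int:
    """Deterministically map a key to a partition, as hash-partitioned
    datastores and Kafka/Spark partitioners do (MD5 here is illustrative)."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_partitions

def replicas(key: str, nodes: list[str], replication_factor: int = 3) -> list[str]:
    """Place a key on N consecutive nodes starting at its home partition,
    a simplified view of ring-based replication for horizontal scaling."""
    start = partition(key, len(nodes))
    return [nodes[(start + i) % len(nodes)] for i in range(replication_factor)]

nodes = ["node-a", "node-b", "node-c", "node-d"]
placement = replicas("user:42", nodes)  # 3 distinct nodes, wrapping around the ring
```

The CAP trade-off enters when one of those replicas is unreachable: the system must either refuse the write (consistency) or accept it and reconcile later (availability).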
Mentor To Go is an Android and web-based platform that helps students improve their work readiness through self-learning sessions, personalised mentorship from mentor professionals across a range of careers, and work-experience opportunities.
The technology platform implements functionalities such as: screening surveys of mentors and mentees, training content for mentors, matching through an algorithm, and mentorship through the delivery of structured activities.
The application technology architecture comprises the following:
- a Python Django web application with Nginx and Gunicorn
- a PostgreSQL database
- a React Android and web app
- REST APIs which interact between the Android app and the Django web server
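The matching functionality mentioned above can be illustrated with a simple greedy, score-based matcher. The scoring function, the interest profiles, and the greedy strategy here are all hypothetical; the platform's actual algorithm is not described in this posting.

```python
def match_pairs(mentors, mentees, score):
    """Greedy one-to-one matching: repeatedly pair the highest-scoring
    (mentor, mentee) combination. Illustrative only."""
    pairs = []
    free_mentors, free_mentees = set(mentors), set(mentees)
    candidates = sorted(
        ((score(m, s), m, s) for m in mentors for s in mentees),
        reverse=True,
    )
    for _, mentor, mentee in candidates:
        if mentor in free_mentors and mentee in free_mentees:
            pairs.append((mentor, mentee))
            free_mentors.remove(mentor)
            free_mentees.remove(mentee)
    return pairs

# Hypothetical interest profiles, e.g. gathered from screening surveys.
interests = {
    "mentor-1": {"data", "python"},
    "mentor-2": {"design"},
    "mentee-a": {"python", "web"},
    "mentee-b": {"design", "art"},
}
overlap = lambda m, s: len(interests[m] & interests[s])
pairs = match_pairs(["mentor-1", "mentor-2"], ["mentee-a", "mentee-b"], overlap)
```

In the real platform this logic would live behind one of the REST APIs, with the survey responses stored in PostgreSQL.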
Our vision at Mentor To Go is to provide career mentorship to young people anywhere, anytime across the length and breadth of India.
We are looking for a backend software developer who is passionate about using web technologies to solve social problems and about creating experiences that are elegant and effective. You will own the back-end development of features of the Mentor To Go platform from conceptualisation through design and testing, working closely with other developers, the technical project manager, and the program operations team. You are user-centric, continuously demonstrate strategic and analytical abilities, and are laser-focused on executing at speed. You must be able to succeed in a fast-paced environment, where success depends on your ability to collaborate with cross-functional team members in a positive, productive, and transparent way.
Minimum requirements
- Bachelor’s degree in engineering, preferably CS or a related discipline
- 3+ years of experience working in Unix/Linux environments building web and mobile applications
- Proficiency in developing and deploying cloud-based Python/PostgreSQL applications, preferably using frameworks such as Django
- Experience with design and development of moderately complex software projects
- Strong written and oral communication skills
- Familiarity with version control software such as Git
Desired requirements
- 5+ years of overall relevant work experience
- Experience building apps using React/React Native
- Basic understanding of AWS cloud
- Ability to participate in technical discussions and help make technical trade-offs
About Mentor Together
Mentor Together (https://mentortogether.org/) is India’s first and largest youth-mentoring non-profit organization, with a mission to facilitate empowering mentoring relationships and networks that help young people break the inequalities of opportunity and actualise their potential.
Mentor To Go (http://bit.ly/mentortogo) is the world’s first mobile mentoring platform, created by Mentor Together with the support of Cisco India, LinkedIn Social Impact, British Telecom and Sterlite Technologies.
About Vymo
Vymo is a San Francisco-based next-generation sales productivity SaaS company with offices in 7 locations. Vymo is funded by top-tier VC firms like Emergence Capital and Sequoia Capital. Vymo is a category creator: an intelligent Personal Sales Assistant that captures sales activities automatically, learns from top performers, and predicts ‘next best actions’ contextually. Vymo has 100,000 users in 60+ large enterprises such as AXA, Allianz, and Generali. Vymo has seen 3x annual growth over the last few years and aspires to do even better this year by building up the team globally.
What is the Personal Sales Assistant
A game-changer! We thrive in the CRM space, where every company is struggling to deliver meaningful engagement to their sales teams and IT systems. Vymo was engineered with a mobile-first philosophy. Through AI/ML, the platform detects, predicts, and learns how to make sales representatives more productive through nudges and suggestions on a mobile device. Explore Vymo at https://getvymo.com/.
What you will do at Vymo
From young open-source enthusiasts to experienced Googlers, this team develops products like Lead Management System, Intelligent Allocations & Route Mapping, and Intelligent Interventions, which help improve the effectiveness of sales teams manifold. These products power the "Personal Assistant" app, which automates sales-force activities, leveraging our cutting-edge location-based technology and intelligent routing algorithms.
A Day in your Life
- Design, develop, and maintain robust data platforms on top of Kafka, Spark, Elasticsearch, etc.
- Provide leadership to a group of engineers in an innovative and fast-paced environment.
- Manage and drive complex technical projects from the planning stage through execution.
What you would have done
- B.E. (or equivalent) in Computer Science.
- 6-9 years of experience building enterprise-class products/platforms.
- Knowledge of Big Data systems and/or data-pipeline building experience is preferred.
- 2-3 years of relevant work experience as a technical lead, or technical management experience.
- Excellent coding skills in Core Java or Node.js.
- Demonstrated problem solving skills in previous roles.
- Good communication skills.