

About uFaber Edutech
uFaber is one of the fastest growing edutech companies in India, specialising in personalised training with the latest technologies and high-quality content. Spearheaded by IITians with over 10 years of experience in education, uFaber has one of the largest course catalogues across languages, entrance exams, and skill enhancement subjects. We currently have over 1,000 hours of video content and a customer base of 50,000+ students.
Key Responsibilities:
- Rewrite existing APIs in NodeJS.
- Remodel the APIs into a microservices-based architecture.
- Implement a caching layer wherever possible.
- Optimize the API for high performance and scalability.
- Write unit tests for the APIs.
- Automate the code testing and deployment process.
Skills Required:
- At least 3 years of experience developing Backends using NodeJS — should be well versed with its asynchronous nature & event loop, and know its quirks and workarounds.
- Excellent hands-on experience using MySQL or any other SQL Database.
- Good knowledge of MongoDB or any other NoSQL Database.
- Good knowledge of Redis, its data types, and their use cases.
- Experience with graph databases like Neo4j, and with GraphQL-based APIs.
- Experience developing and deploying REST APIs.
- Good knowledge of Unit Testing and available Test Frameworks.
- Good understanding of advanced JS libraries and frameworks.
- Experience with Web sockets, Service Workers, and Web Push Notifications.
- Familiar with NodeJS profiling tools.
- Proficient understanding of code versioning tools such as Git.
- Good knowledge of creating and maintaining DevOps infrastructure on cloud platforms.
- Should be a fast learner and a go-getter, with no fear of trying out new things.
Preferences:
- Experience building a large-scale social or location-based app.
We are looking for middle and senior level female Java technologists in Bangalore.
These openings are with a leading US MNC's Bangalore IDC.
This is an individual contributor (IC) role; we need hands-on candidates only.
- Strong work experience in Core Java and Java fundamentals
- Proficiency with algorithms, data structures, multithreading, OOP, and IPC
- Strong knowledge of any object-oriented language (Java preferred)
- Experience in database programming; MySQL experience is a plus.
SKILLS/EXPERIENCE:
Job: Senior Developer / Lead Engineer / Architect: 3.5-12 years
Location: Bangalore
Technology skill set:
- Required skills: Core Java, practical implementation of object-oriented constructs and paradigms, data structures, algorithms, multithreading, collections, design
Expertise:
- Required: Java specialists with expertise in implementing object-oriented constructs and paradigms in a multithreaded environment using Core Java.
Responsibilities :
- Responsible for designing complex systems
- Independently act as a technical expert
- Handle multiple tasks at one time
Eligibility Criteria:
- Strong background in Java / Core Java; hands-on coding.
- Bachelor's degree in computer science or a related field
- Proficient in OOP, OOAD, data structures, algorithms, and multithreading (a brief illustrative sketch follows this list)
- Database programming skill in SQL/Sybase/Oracle is a plus
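For illustration only, here is a minimal, hypothetical sketch of the kind of Core Java work described above: object-oriented constructs used in a multithreaded producer/consumer setup built on java.util.concurrent. The class and field names are invented and do not come from the job description.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch: a producer/consumer pipeline using Core Java
// concurrency primitives. All names are invented for illustration.
public class OrderPipeline {

    // A simple value object passed between threads (requires Java 16+ for records).
    record Order(int id, double amount) {}

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Order> queue = new ArrayBlockingQueue<>(100);

        // Producer: publishes orders onto the shared, thread-safe queue.
        Thread producer = new Thread(() -> {
            for (int i = 1; i <= 5; i++) {
                try {
                    queue.put(new Order(i, i * 10.0));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        // Consumer: takes orders off the queue and processes them.
        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    Order order = queue.take();
                    System.out.println("Processed order " + order.id());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}
```

The BlockingQueue handles all synchronization between the two threads, which keeps the object-oriented domain code free of explicit locks.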

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets: both the Spark-like backend infrastructure and a library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best software development tools and processes.
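As a rough, hypothetical illustration of the kind of time-series computation described above (not Conviva's actual engine), the sketch below buckets a stream of events into fixed one-minute windows and counts events per window; the Event record and all other names are invented.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch of windowed aggregation over time-series events.
// Illustrative only; not Conviva's implementation.
public class WindowedCount {

    record Event(long timestampMillis, String sessionId) {} // requires Java 16+

    static final long WINDOW_MILLIS = 60_000; // one-minute tumbling windows

    static Map<Long, Long> countPerWindow(List<Event> events) {
        Map<Long, Long> counts = new TreeMap<>(); // keyed by window start time
        for (Event e : events) {
            long windowStart = (e.timestampMillis() / WINDOW_MILLIS) * WINDOW_MILLIS;
            counts.merge(windowStart, 1L, Long::sum); // increment the window's count
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Event> events = List.of(
                new Event(1_000, "s1"),
                new Event(30_000, "s2"),
                new Event(61_000, "s1"));
        // Two events fall in the first window, one in the second.
        System.out.println(countPerWindow(events)); // {0=2, 60000=1}
    }
}
```

A production system would of course do this incrementally over an unbounded stream and at far larger scale, but the windowing idea is the same.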
What You’ll Do
This is an individual contributor position. Expectations are along the following lines:
- Design, build, and maintain the stream-processing and time-series analysis system that is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, and new services, and fix bugs in Scala and Java on a Jenkins-based pipeline, deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!


Strong fundamentals in OOPS, RDBMS, Data structures, and Design patterns
Experience in any SQL database (Preferably SQL Server)
Ability to write clean code
Ability to write automated unit tests using frameworks such as NUnit, Mock, etc.
Ability to communicate and articulate clearly and work collaboratively in an agile team
Experience with code repositories like git



● We believe that the role of an engineer at a typical product company in India has to evolve from just working in a request response mode to something more involved.
● Typically an engineer has very little to no connection with the product, its users, overall success criteria or long term vision of the product that he/she is working on.
● The system is not set up to encourage it. Engineers are evaluated on their tech prowess, and very little attention is given to other aspects of being a successful engineer.
● We don’t hold appraisals because we believe that evaluation of work and feedback should be a constant affair rather than something that happens every 6 or 12 months. Besides, there is no better testament to your abilities than the growth of the product.
● We don’t have a concept of hierarchy, and hence we don’t have promotions. All we have at Udaan are Software Engineers.
Skills & Knowledge:
○ 4-15 years of experience
○ Sound knowledge of programming
○ High ownership & impact oriented
○ Creative thinking & implementation
○ Highly customer obsessed & always insisting on the highest standards
Job title: SSE / Associate Technical Lead / Technical Lead
Experience:
- Extensive product development experience
- 4-8 years of experience in back-end Java development
- Advanced knowledge of object-oriented analysis, design, and development (OOA/OOD)
- Team task assignment, mentoring, and helping resolve issues
- Self-starter; able to work with minimal supervision
- Skilled at working collaboratively in a team-oriented environment
- Excellent problem-solving skills; curious and adept at researching project-related issues and challenges
- Excellent debugging skills
- Experience with SOA and the microservice deployment model
- GWT development experience
- Experience working in an Agile development model
- Mixed experience in different types of organizations: product start-ups and large enterprises
Other Skills:
- Strong analytical skills
- Acumen for understanding the customer's business goals
- Process oriented: following current processes and partnering in process improvement
- Data oriented: using data and the knowledge base to solve the current problem at hand, and creating a software knowledge base for future use
- Ability to juggle multiple development and design tasks simultaneously
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with a passion for, and experience in, architecting and delivering high-quality distributed systems at massive scale.
Responsibilities & ownership
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment (a brief illustrative sketch follows this list)
- Propose and promote strategic company-wide tech investments taking care of business goals, customer requirements, and industry standards
- Lead the team to solve complex, unknown and ambiguous problems, and customer issues cutting across team and module boundaries with technical expertise, and influence others
- Review and influence designs of other team members
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Partner with other leaders to nurture innovation and engineering excellence in the team
- Drive priorities with others to facilitate timely accomplishments of business objectives
- Perform RCA of customer issues and drive investments to avoid similar issues in future
- Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
- Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth
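As a loose, hypothetical illustration of the concurrency and parallelization topics referenced in this list (not Dremio code), the sketch below splits a computation across a fixed thread pool and combines the partial results; all names are invented.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch: parallel aggregation with a fixed thread pool.
// Illustrative only; names and scenario are invented.
public class ParallelSum {

    public static void main(String[] args) throws Exception {
        int[] data = new int[1_000_000];
        for (int i = 0; i < data.length; i++) data[i] = 1;

        int parallelism = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(parallelism);
        int chunk = data.length / parallelism;

        List<Future<Long>> partials = new ArrayList<>();
        for (int p = 0; p < parallelism; p++) {
            final int start = p * chunk;
            final int end = (p == parallelism - 1) ? data.length : start + chunk;
            // Each task sums its own slice; tasks share no mutable state.
            Callable<Long> task = () -> {
                long sum = 0;
                for (int i = start; i < end; i++) sum += data[i];
                return sum;
            };
            partials.add(pool.submit(task));
        }

        long total = 0;
        for (Future<Long> f : partials) total += f.get(); // combine partial results
        pool.shutdown();

        System.out.println("total = " + total); // prints total = 1000000
    }
}
```

Partitioning the data so that each task owns a disjoint slice avoids locks entirely, which is the same principle behind sharding and partitioning in distributed systems.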
Requirements
- B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 15+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
- 8+ years of experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Subject matter expert in one or more of query processing or optimization, distributed systems, concurrency, microservice-based architectures, data replication, networking, and storage systems
- Experience in taking company-wide initiatives, convincing stakeholders, and delivering them
- Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
- Ability to anticipate and propose plan/design changes based on changing requirements
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Hands-on experience working on projects on AWS, Azure, and GCP
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
- Understanding of distributed file systems such as S3, ADLS or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
• Bachelor's or Master’s degree in Computer Science or a related field.
• 5+ years of professional development experience.
• Deep experience in web applications, object-oriented programming, web services, REST, cloud computing, AWS/Azure, node.js, and full-stack development.
• Experience with multiple programming languages and frameworks, including at least one of JavaScript/HTML/CSS, Java, ReactJS, or Python.
• Experience in designing, developing, and managing large-scale web services.
• Advanced JavaScript knowledge is a must.
• Experience designing APIs and frameworks that are used by others.
• Familiar with Git, Confluence, and Jira.
• Exceptional problem-solving skills, with experience in defining and understanding complex system architectures and design patterns.
• Excellent communication skills; able to articulate technical decisions and produce excellent technical documents.
• Experience creating and maintaining unit tests and continuous integration.
• Contribution to open source is a plus.
• Experience developing cross-platform applications is a plus.



