11+ Functional programming Jobs in Bangalore (Bengaluru) | Functional programming Job openings in Bangalore (Bengaluru)
Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server-side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets, covering both the Spark-like backend infrastructure and the library-based programming model. Build horizontally and vertically scalable systems that analyze trillions of events per day within sub-second latencies. Utilize the latest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best software development tools and processes.
What You’ll Do
This is an individual contributor position. Expectations are along the following lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixes in Scala and Java on a Jenkins-based pipeline, deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
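The stream-processing and time-series sessionization work described above can be illustrated with a toy sketch. This is our own minimal example, not Conviva's actual system; the window size, event shape, and field names are all hypothetical:

```python
from collections import defaultdict

# Toy sketch (not Conviva's system): a tumbling-window aggregator over a
# stream of (timestamp, session_id, metric) events, illustrating the kind
# of real-time time-series aggregation the role describes.

WINDOW_SECONDS = 60

def assign_window(ts: int) -> int:
    """Map an event timestamp to the start of its tumbling window."""
    return ts - (ts % WINDOW_SECONDS)

def aggregate(events):
    """Sum a metric per (window, session) across the stream."""
    totals = defaultdict(float)
    for ts, session_id, value in events:
        totals[(assign_window(ts), session_id)] += value
    return dict(totals)

events = [
    (5,  "s1", 1.0),   # falls in the window starting at t=0
    (42, "s1", 2.0),   # same window, same session
    (61, "s2", 3.0),   # falls in the window starting at t=60
]
result = aggregate(events)
```

A production version of this idea runs the same logic continuously and in parallel across partitioned streams, which is where the scalability challenges listed below come in.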
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of Computer Science fundamentals like algorithms and data structures. Hands-on experience with functional programming and a solid grasp of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
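The functional-programming fluency these roles ask for usually comes down to a few core ideas: pure functions, immutability, and higher-order functions. A tiny illustration of our own (not taken from any listing):

```python
from functools import reduce

# Illustrative only: pure functions and higher-order functions composed
# into a pipeline -- the core concepts behind "hands-on with functional
# programming" in listings like the one above.

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

square_evens_sum = compose(
    sum,                                        # fold the sequence to one value
    lambda xs: [x * x for x in xs],             # map: square each value
    lambda xs: [x for x in xs if x % 2 == 0],   # filter: keep even values
)

result = square_evens_sum(range(1, 7))  # evens 2, 4, 6 -> 4 + 16 + 36 = 56
```

Each stage is a pure function of its input, so the pipeline is easy to test in isolation and safe to parallelize, which is exactly why functional style matters for the stream-processing systems described above.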
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop, and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Working knowledge of Kafka.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
We're looking for highly skilled experienced engineers to design and build high-scale, cloud-based data processing systems that can handle massive amounts of data with low latency. You'll work with a team of smart, motivated, and diverse people and be given the autonomy and support to do your best work. This is a rare opportunity to make a meaningful impact in society while working in a dynamic and flexible workplace where you'll belong and be encouraged.
Qualifications:
- Bachelor's Degree required
- Significant experience with distributed systems.
- Experience with modern programming languages such as Java, C#, C/C++, or Ruby.
- Experience with container platforms such as DC/OS or Kubernetes.
- Fluency in technologies and design concepts around Big Data processing and relational databases, such as the Hadoop ecosystem, Map/Reduce, stream processing, etc.
- Experience with production operations and good practices for putting quality code into production and troubleshooting issues when they arise.
- Effective communication of technical ideas verbally and in writing, including technical proposals, design specs, architecture diagrams, and presentations.
- Ability to collaborate effectively with the team and other stakeholders.
- Preferably, production experience with Cloud and data processing technologies.
Responsibilities:
As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design, and develop software according to those specifications. You will perform professional software development tasks associated with developing, designing, and debugging software applications or operating systems.
- Design and build distributed, scalable, and fault-tolerant software systems.
- Build cloud services on top of the modern OCI infrastructure.
- Participate in the entire software lifecycle, from design to development, to quality assurance, and to production.
- Invest in the best engineering and operational practices upfront to ensure our software quality bar is high.
- Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies.
- Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software.
CTC Budget: 35-55LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad that helps clients maximize product value while delivering rapid, incremental innovation. The company has extensive SaaS M&A experience, including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.
- 6 plus years of experience as a Python developer.
- Experience in web development using Python and Django Framework.
- Experience in Data Analysis and Data Science using Pandas, NumPy, and scikit-learn (GTH)
- Experience in developing User Interface using HTML, JavaScript, CSS.
- Experience with server-side templating languages, including Jinja2 and Mako
- Knowledge of Kafka and RabbitMQ (GTH)
- Experience with Docker, Git, and AWS
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Databases: MySQL, PostgreSQL, SQL
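The server-side templating experience asked for above (Jinja2, Mako) can be illustrated with Python's standard-library `string.Template`, a deliberately simpler stand-in; real Jinja2 templates add control flow, template inheritance, and autoescaping on top of this basic substitution idea:

```python
from string import Template

# Illustrative only: string.Template is a stdlib stand-in for the
# Jinja2/Mako templating the listing mentions. The template text and
# variable names here are made up for the example.

page = Template("<h1>Hello, $name!</h1><p>You have $count new messages.</p>")

# substitute() raises KeyError on missing variables; safe_substitute()
# would leave unknown placeholders in place instead.
html = page.substitute(name="Asha", count=3)
```

The same pattern, i.e. a template with placeholders rendered against a context dict, is what Django templates and Jinja2 do at much larger scale.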
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
- 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors.
- 5+ years of experience with Kafka, including 3+ years with Confluent Kafka
- Cloud Kafka, Control Central, Rest Proxy, HA Proxy, Confluent Kafka Connect, Confluent Kafka Security features
- 2 + years of software development experience utilizing Java based frameworks and technologies to build enterprise grade product solutions.
- 2+ years of experience designing, developing and documenting RESTful APIs.
- Strong understanding of concepts/technologies like Spring MVC, Spring Boot, J2EE, EJB, application/API security, and API governance/gateway platforms like Apigee or Kong.
- Good understanding of RDBMS concepts and development, preferably using MS SQL Server.
- Experience using test automation technologies like Cucumber and Selenium is a big plus.
- Partner with scrum masters to address technical blockers/impediments to progress.
- Prior experience working with agile scrum-based development methodology. Participate in sprint planning and estimate development efforts for features and stories.
- Partner with DevOps to install, configure, and tune application containers like embedded Spring Boot Tomcat and web server technologies, preferably NGINX or Apache.
- Partner with cloud engineering group to outline the infrastructure provisioning requirements/needs for new product development.
- Design and develop CI/CD tooling and processes, preferably using Jenkins, to configure build jobs for APIs and design pipelines that promote artifacts from development all the way to production.
- Experience with messaging technologies like Kafka is a huge plus.
- Work with Test Automation engineering team to integrate test automation scripts as part of the CI/CD process.
- Proven prior experience utilizing and productionizing container and container orchestration technologies like Docker and Kubernetes.
- Participate in and resolve issues related to deployments, application performance, etc.
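The Kafka expertise this role centers on rests on two ideas: keyed partitioning of an append-only log, and consumers tracking their own offsets. A toy in-memory model of our own (this is not the Confluent client API, and the class and method names are invented for illustration):

```python
# Toy model (not the Confluent client API) of Kafka's core abstractions:
# records are routed to partitions by key, each partition is an ordered
# append-only log, and consumers read from an offset they track themselves.

class ToyTopic:
    def __init__(self, num_partitions: int):
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key: str, value: str) -> int:
        """Route a record to a partition by key hash, like Kafka's
        default partitioner; same key -> same partition -> ordering."""
        p = hash(key) % len(self.partitions)
        self.partitions[p].append(value)
        return p

    def consume(self, partition: int, offset: int):
        """Return everything at or after `offset`. Reading does not
        delete records, so multiple consumer groups can replay the log."""
        return self.partitions[partition][offset:]

topic = ToyTopic(num_partitions=4)
p = topic.produce("user-42", "click")
topic.produce("user-42", "purchase")   # same key: same partition, order kept
records = topic.consume(p, offset=0)
```

Real Kafka adds replication, retention, and consumer-group rebalancing on top, but per-key ordering and offset-based consumption are the behaviors most interview questions probe.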
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.
Top Skills
- You write high quality, maintainable, and robust code, often in Java or C++/C/Python/ROR/C#
- You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
- You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities
- You solve problems at their root, stepping back to understand the broader context.
- You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
- You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
- You recognize and use design patterns to solve business problems.
- You understand how operating systems work, perform and scale.
- You continually align your work with Amazon’s business objectives and seek to deliver business value.
- You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
- You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
- You communicate clearly with your team and with other groups and listen effectively.
Skills & Experience
- Bachelors or Masters in Computer Science or relevant technical field.
- Experience in software development and full product life-cycle.
- Excellent programming skills in any object-oriented programming language - preferably Java, C/C++/C#, Perl, Python, or Ruby.
- Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
- Proficiency in SQL and data modeling.
Position Responsibilities:
1. Design, implementation, and deployment of applications
2. Expert knowledge in performance, scalability, enterprise system architecture, and engineering best practices.
3. Functionally decompose complex problems into simple, straight-forward solutions.
4. Work extensively with cross-functional teams across the organization.
5. Work with the business team and project managers to convert functional requirements into detailed technical specifications.
6. The ideal candidate will be a leader, builder, and operator, able to work in a very fast-paced environment where time to market is supercritical.
Desired Candidate profile:
1. A Bachelor's/Master's degree in Computer Science, or an equivalent combination of technical education from an elite college or institution and work experience.
2. 4+ years of Software Development experience.
3. Excellent object-oriented design and coding skills (Java, C++ on a UNIX or Linux platform).
4. Very strong software development background including design patterns, data structures, test-driven development.
5. Ability to design and implement systems end to end on your own while maintaining highest coding standards.
6. Excellent knowledge of design patterns and ability to reflect it in their code.
7. Ability to lead projects and mentor junior engineers on the same.
8. Mandatory work experience in skill sets: REST API, JDBC, RDBMS (PostgreSQL, MySQL)
9. Solid Experience with distributed (multi-tiered) systems, algorithms, and relational databases.
10. Software development experience in Servlet, JSP, Spring, AWS, S3, SQS, building web services and highly scalable applications or Google Cloud Suite.
11. Excellent verbal and written communication skills.
About Us: Newbie Soft Solutions is an IT service provider focused on providing solutions in niche areas to support and build future-ready, resilient solutions for medium-sized industries and growth-focused technology organizations.
The name NEWBIE signifies a new chapter, a new beginning in the field of staffing solutions. Founded in 2015, we have grown from strength to strength with a strong presence across India, United States and Australia. Our offerings include Staffing Solutions, IT Consulting, Business Intelligence, Security Solutions, Legacy Application Management and Modernization. We value consistency, which is our core principle, to reach the end goal of complete user satisfaction. We constantly strive to outperform our competitors to become the leaders in digital revolution.
Job Requirement :
- Clear understanding of end to end communication of service calls via API Gateway/Service Mesh/Service Registry
- Experience with Spring Boot, Spring Cloud, and RESTful web services
- Experience with containerization (Docker) and Kubernetes: creating container images, writing manifest files/Helm charts, designing Pods, sidecar patterns, etc.
- Good backend web application design experience; since we operate as a DevOps pod, we expect the person to be involved in production deployments and support.
- Exposure to CI/CD tools like Git, Jenkins, Maven, Sonar, JUnit, CheckMarx, Netsparker, and Cucumber
We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help businesses make data-driven decisions every day. We also give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web. At serious scale!
What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Ability to see the impact of your work and the value you are adding to our customers almost immediately.
- Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
- A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
- Last but not least, competitive salary packages and fast-paced growth opportunities.
Roles and Responsibilities:
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next-generation systems.
● Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrancy to the team. Push the envelope.
Skills and Requirements:
● 5-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of your projects you have hosted on GitHub.
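The MapReduce model mentioned in the requirements above can be sketched in a few lines. This is a single-process illustration of our own, not DataWeave's pipeline; real MapReduce and Spark distribute exactly these two phases across machines:

```python
from collections import Counter
from functools import reduce

# Minimal single-process sketch of the MapReduce model: map each document
# to per-word counts independently, then reduce by merging the partial
# counts. The documents here are made-up sample data.

def map_phase(doc: str) -> Counter:
    """Map: one document in, a bag of (word, count) pairs out."""
    return Counter(doc.lower().split())

def reduce_phase(a: Counter, b: Counter) -> Counter:
    """Reduce: merge two partial counts; Counter addition sums per key."""
    return a + b

docs = ["the quick brown fox", "the lazy dog", "the fox"]
word_counts = reduce(reduce_phase, map(map_phase, docs), Counter())
```

Because each map call is independent and the reduce is associative, the work parallelizes cleanly, which is the property that lets frameworks like Spark scale this pattern to the data volumes these roles describe.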