11+ Focus Jobs in Bangalore (Bengaluru) | Focus Job openings in Bangalore (Bengaluru)
Apply to 11+ Focus Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Focus Job opportunities across top companies like Google, Amazon & Adobe.



What you'll do:
- Work closely with product managers and engineers to design, implement, test, and continually improve scalable frontend and backend services.
- Develop products using agile methods and tools.
- Develop commercial-grade software that is user-friendly and suitable for a global audience.
- Plan, create, and execute (manual and automated) tests.
- Be involved and participate in the overall application lifecycle.
- Build reusable code and libraries for future use.
- Stay up to date with current technologies and provide insights on cutting-edge software approaches, architectures, and vendors.
- Ensure that non-functional requirements such as security, performance, maintainability, scalability, usability, and reliability are being considered when architecting solutions.
Skills you bring to the table:
- Fluency in any one of JavaScript, TypeScript, or Python.
- Strong problem-solving skills.
- Should have built large scalable enterprise applications from scratch.
- Strong experience with architectural patterns and high-level design.
- Experience with SQL and NoSQL databases.
- A knack for launching and iterating on products quickly, with quality and efficiency.
- Willingness to learn and ability to flourish in a dynamic, high-growth, entrepreneurial environment
- Hands-on, self-starter, capable of working independently
- True love for technology and what you do
- Maniacal attention to detail
- 3+ years of experience


Hiring for Fullstack Developer - Shopify
Profile - Full stack Shopify Developer
Skills: Node.js / React.js / Angular, Shopify, MySQL, REST API
Experience: 3+ years
Location: Bangalore (WFO)
Notice: Immediate to 1 week
Interested candidates, please share your CV at jyoti.kaur(at)programming.com
Thank You
Jyoti
We have openings for Fullstack / Backend / Frontend Developers who can write reliable, scalable, testable and maintainable code.
At Everest, we innovate at the intersection of design and engineering to produce outstanding products. The work we do is meaningful and challenging - which makes it interesting. Imagine each line of your code making the world a better place. We work five-day weeks, and overtime is a rarity. If clean architecture, TDD, DDD, DevOps, microservices, micro-frontends, and scalable systems resonate with you, please apply.
To see the quality of our code, you can check out some of our open source projects: https://github.com/everest-engineering
If you want to know more about our culture:
https://github.com/everest-engineering/manifesto
Some videos that can help:
https://www.youtube.com/watch?v=A7y9RpqXAdA
https://youtu.be/PPjyP1WPOn8
- Passion to own and create amazing products.
- Should be able to clearly understand the customer's problem.
- Should be a collaborative problem solver.
- Should be a team player.
- Should be open to learning from others and teaching others.
- Should be able to take feedback and improve continuously.
- Should commit to inclusion, equity & diversity.
- Should maintain integrity at work.
- Familiarity with Agile methodologies and clean code.
- Design and/or contribute to client-side and server-side architecture.
- Well versed with the fundamentals of REST.
- Build the front-end of applications through appealing visual design.
- Knowledge of one or more front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery, TypeScript) and JavaScript frameworks (e.g. Angular, React, Redux, Vue.js).
- Knowledge of one or more back-end languages (e.g. C#, Java, Python, Go, Node.js) and frameworks (e.g. Spring Boot, .NET Core).
- Well versed with the fundamentals of database design.
- Familiarity with databases - RDBMS like MySQL and Postgres, and NoSQL like MongoDB and DynamoDB.
- Well versed with one or more cloud platforms like AWS, Azure, or GCP.
- Familiarity with Infrastructure as Code (CloudFormation, Terraform) and deployment tools like Docker and Kubernetes.
- Familiarity with CI/CD tools like Jenkins, CircleCI, and GitHub Actions.
- Unit testing tools like JUnit, Mockito, Chai, Mocha, and Jest.

India's leading financial services company
● Drive/Execute full stack Data driven Growth & marketing campaigns for our UPI business, focused on key metrics in terms of Strategy, Team management, Product Marketing, Growth management, and Revenue management.
● Should be responsible for driving the User Growth, Revenue Growth, Retention, and Engagement of the platform on a day-to-day basis.
● Deploy successful marketing campaigns and take ownership right from ideation to implementation:
● Segment the customer base on various transactional/behavioural features and design effective 'reach out' programs for them in the form of these marketing campaigns.
● Manage end-to-end consumer lifecycle management for the business vertical.
● Responsible for driving the entire marketing funnel from Acquisition to Retention using various digital marketing channels.
● Monitor customer funnel & identify new levers of growth and apply them to the business
● Suggest new features/product innovations whenever necessary, based on an understanding of consumer behaviour.
Superpowers/ Skills that will help you succeed in this role:
● Bachelor's degree in engineering or a related field from a reputed college.
● Must be data driven, with strong problem-solving and analytical skills.
● High degree of ownership in taking things to completion.
● Ability to multitask and work on a diverse range of requirements.
● Excellent communication skills, with the ability to handle complex negotiations.
● Prior experience in e-commerce industry is a plus.
- 3-6 years of relevant work experience in a similar SDET role.
- Demonstrated experience in test framework design, development and having a strong QA mindset.
- Proficiency in any open source Automation Frameworks
- Hands-on experience with tools like JIRA or similar other tools.
- Define test strategies, analyse test results, and validate functionality; work with development and product teams to gather test requirements, assess risk areas, and convert these into test cases.
- Add new capabilities, features, and test-cycle management to the automation framework used by the entire QA team for test automation.
- Developing telemetry software to connect Junos devices to the cloud
- Fast prototyping and laying the software foundation for product solutions
- Moving prototype solutions to a production cloud multitenant SaaS solution
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build analytics tools that utilize the data pipeline to provide significant insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
Qualification and Desired Experiences
- Master's degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field with a strong mathematical background
- 5+ years of experience building data pipelines for data science-driven solutions
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- Good team player with excellent interpersonal, written, verbal, and presentation skills
- Create and maintain optimal data pipeline architecture.
- Assemble large, sophisticated data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience with AWS, S3, Flink, Spark, Kafka, Elastic Search
- Previous work in a start-up environment
- 3+ years of experience building data pipelines for data science-driven solutions
- Master's degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field with a strong mathematical background
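The pipeline responsibilities above (extraction, transformation, and loading of data from varied sources) can be sketched minimally. This is an illustrative toy using only the standard library, with made-up table and column names; a production pipeline would add scheduling, error handling, and a real warehouse.

```python
import csv
import io
import sqlite3

# Toy source data; in practice this would come from an API, file, or queue.
RAW_CSV = """user_id,amount
1,250
2,-40
3,125
"""

def extract(source: str):
    """Extract: read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: convert types and drop invalid (non-positive) amounts."""
    cleaned = []
    for row in rows:
        amount = int(row["amount"])
        if amount > 0:
            cleaned.append((int(row["user_id"]), amount))
    return cleaned

def load(rows, conn):
    """Load: write the cleaned rows into a SQL table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 375
```

The same extract/transform/load separation is what orchestration tools like Airflow or Luigi schedule at scale, one task per stage.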
- We are looking for a candidate with 9+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Proven understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and interpersonal skills.
- Experience supporting and working with multi-functional teams in a multidimensional environment.




- Adept at machine learning techniques and algorithms.
- Feature selection, dimensionality reduction, and building and optimizing classifiers using machine learning techniques.
- Data mining using state-of-the-art methods.
- Doing ad-hoc analysis and presenting results.
- Proficiency in query languages such as N1QL and SQL.
- Experience with data visualization tools such as D3.js, ggplot, Plotly, PyPlot, etc.
- Creating automated anomaly detection systems and constantly tracking their performance.
- Strong in Python (a must).
- Strong in data analysis and mining (a must).
- Deep learning, neural networks, CNNs, image processing (must).
- Building analytic systems: data collection, cleansing, and integration.
- Experience with NoSQL databases such as Couchbase, MongoDB, Cassandra, and HBase.
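One of the bullets above asks for automated anomaly detection. A minimal sketch of the simplest version, a z-score detector over a batch of readings, assuming roughly normally distributed data (production systems would use streaming statistics or learned models, and the sample values here are invented):

```python
import statistics

def detect_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # no spread, nothing to flag
    return [x for x in values if abs(x - mean) / stdev > threshold]

readings = [10, 11, 9, 10, 12, 10, 11, 95]  # 95 is the injected outlier
print(detect_anomalies(readings, threshold=2.0))  # [95]
```

Tracking the detector's own performance over time (false-positive rate, drift of the baseline mean) is the "constant tracking" the bullet refers to.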
- Java, J2EE
- Microservices
- Spring Boot
- JPA, REST API, JSON, JWT, OAuth, Spring Security, Swagger
- Oracle / MySQL / Postgres DB
- Angular / React / React Native
- JIRA, Bitbucket
Must-Have -
- Excellent knowledge of Core Java and Spring
- Strong experience in REST API and web services
- Experience in Oracle / Mysql / Postgres DB
- Should be proficient with Java/J2EE and related technologies
- Should have very good communication skills and analytical skills.
- Should have good knowledge in Software Development Life Cycle and Agile methodologies.
- Good knowledge of current / emerging technologies and trends
- Good analytical and problem-solving skills
- Excellent written and verbal communication skills. High levels of initiative and creativity
- Good communication skills with all stakeholders, good team player with the ability to mentor juniors
Good to Have:
- Knowledge in Banking Domain
- Full-stack developers with knowledge and understanding of Javascript and associated technologies like React, Angular, HTML5, CSS will have an advantage.
- Data Steward:
The Data Steward will collaborate and work closely within the group's software engineering and business divisions. The Data Steward has overall accountability for the group's/division's data and reporting posture by responsibly managing data assets, data lineage, and data access, and by supporting sound data analysis. This role focuses on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes; it makes well-thought-out decisions on complex or ambiguous data issues and establishes the data stewardship and information management strategy and direction for the group. It requires effective communication with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary aligned to data assets and the business glossary, and for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects.
- Good problem-solving and decision-making skills.
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim)
Nice to Have:
- 1 year of ETL experience
- Natural Language Processing
- Neural networks and deep learning
- Experience with the Keras, TensorFlow, spaCy, NLTK, and LightGBM Python libraries
Interaction: Frequently interacts with subordinate supervisors.
Education: Bachelor's degree, preferably in Computer Science, B.E., or another quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required.
Experience: 7 years of pharmaceutical/biotech/life sciences experience, 5 years of clinical trials experience and knowledge, and excellent documentation, communication, and presentation skills, including PowerPoint.
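Much of the Data Steward role above centres on data profiling and data quality checks. Purely as an illustrative sketch (the dataset and column names are invented), here is the kind of per-column profile a steward might run before signing off on data quality:

```python
def profile(rows):
    """For each column, report row count, missing values, and distinct values."""
    columns = rows[0].keys() if rows else []
    report = {}
    for col in columns:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "rows": len(values),
            "missing": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

# Toy clinical-trial-style records with one missing age.
patients = [
    {"site": "BLR", "age": 34},
    {"site": "BLR", "age": None},
    {"site": "DEL", "age": 41},
]
print(profile(patients)["age"])  # {'rows': 3, 'missing': 1, 'distinct': 2}
```

Dedicated MDM and profiling tools (the posting names MS DQS, SAS DM Studio, Tamr, Profisee, Reltio) automate and scale exactly this class of check.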
Roles & Responsibilities:
- Design and build APIs/microservices using Spring Boot.
- Experience in Spring Reactive programming.
- Exposure to docker and containerization
- Developing enterprise grade highly scalable java-based application
- Writing test cases using Java testing frameworks like JUnit and Mockito.
- Proficient understanding of code version tools, such as Git, SVN
- Displaying initiative and an ability to lead others, and developing applications with the team in a disciplined manner.
- Participating in application solutions including assisting with planning and architectural design, development, resolution of technical issues, and application rationalization.
- Utilizing and applying robust analytic thinking with the ability to identify, debug, and resolve technical issues.
- Achieving significant contributions within a small team of developers to lead teams and deliver solutions within an Agile methodology, whilst ensuring quality, timeliness and team-wide adherence to good architectural practice and guidelines.
- Good Experience in Application Software Design and Development, Object Oriented Analysis and Design (OOAD), Software Testing and Debugging.
- Conduct peer code reviews
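The role above centres on Spring Boot microservices in Java. Purely as a language-agnostic sketch of the core idea, mapping routes to handler functions that return JSON (the way annotated controller methods do in Spring), here is a toy dispatcher in Python; all routes and payloads are invented for illustration:

```python
import json

# Route table: maps (method, path) to a handler function.
ROUTES = {}

def route(method, path):
    """Decorator that registers a handler, akin to @GetMapping in Spring."""
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/health")
def health():
    return 200, {"status": "UP"}

@route("GET", "/orders/42")
def get_order():
    return 200, {"id": 42, "total": 99.5}

def dispatch(method, path):
    """Look up the handler and serialize its response to JSON."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    status, body = handler()
    return status, json.dumps(body)

print(dispatch("GET", "/health"))  # (200, '{"status": "UP"}')
```

A real framework adds path parameters, content negotiation, and middleware on top of exactly this dispatch step.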