11+ Nexus Jobs in Mumbai | Nexus Job openings in Mumbai
Apply to 11+ Nexus Jobs in Mumbai on CutShort.io. Explore the latest Nexus Job opportunities across top companies like Google, Amazon & Adobe.

- Implement defined user interfaces and experiences based on technical specifications
- Review and develop requirements, generate designs, provide rough-order-of-magnitude estimates, implement, and perform unit testing
- Collaborate with our team of educational application designers, graphic artists, backend systems engineers, QA, and operational teams to ensure the timely delivery of high-value educational software products
- Demonstrate strong communication and interpersonal skills to negotiate differing perspectives and goals
- Hold yourself to a high standard of quality and work with the broader team to ensure that products meet those standards
- Work independently on multiple projects at a time, balancing needs and deadlines
- Troubleshoot problems of a complex nature
- Support systems you have implemented as well as those you have not
- Keep informed of technical advances and propose changes based on newer technology
Required Skills and Abilities:
- Minimum of four years of software engineering experience
- Minimum of three years of software engineering experience building multi-tier enterprise applications
- Demonstrated track record of designing, developing, and delivering single-page, web-based applications based on Angular.io or other similar JS libraries
- Bachelor’s degree in a technical discipline or relevant work experience
- Experience with CI/CD tools like Jenkins, Git, and Nexus
- Excellent interpersonal and communication skills


Job Description
We are looking for a talented Java Developer to work on projects abroad. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
- Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Proven experience as a Java Developer or in a similar role.
- Strong knowledge of the Java programming language and its frameworks (Spring, Hibernate).
- Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
- Familiarity with RESTful APIs and web services.
- Understanding of version control systems (e.g., Git).
- Solid understanding of object-oriented programming (OOP) principles.
- Strong problem-solving skills and attention to detail.

We are looking for a QA engineer with experience in Python, AWS, and chaos engineering tools (Chaos Monkey, Gremlin).
- Strong understanding of distributed systems, cloud computing (AWS), and networking principles
- Ability to understand complex trading systems and prepare and execute plans to induce failures
- Proficiency in Python
- Experience with chaos engineering tooling such as Chaos Monkey, Gremlin, or similar (a minimal sketch follows below)
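For illustration only, here is a minimal, Chaos Monkey-style failure-injection sketch in Python using boto3; the AWS region, the "chaos-opt-in" tag, and the choice to stop an EC2 instance are assumptions, not this role's actual tooling.

```python
# Hypothetical sketch: pick one opted-in EC2 instance at random and stop it to induce a failure.
import random

import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # assumed region

# Only target instances explicitly tagged as safe for chaos experiments (assumed tag).
resp = ec2.describe_instances(
    Filters=[{"Name": "tag:chaos-opt-in", "Values": ["true"]}]
)
instance_ids = [
    inst["InstanceId"]
    for reservation in resp["Reservations"]
    for inst in reservation["Instances"]
]

if instance_ids:
    victim = random.choice(instance_ids)
    print(f"Inducing failure: stopping {victim}")
    ec2.stop_instances(InstanceIds=[victim])
else:
    print("No opted-in instances found; nothing to do.")
```

In practice an experiment like this would also define a steady-state hypothesis and an automatic rollback, which tools such as Gremlin provide out of the box.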
- Design, implement, and improve the analytics platform
- Implement and simplify self-service data query and analysis capabilities of the BI platform
- Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility
- Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale
- Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms (an illustrative clustering sketch follows below)
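As a purely illustrative sketch of the KPI/clustering work listed above, here is a small scikit-learn example; the KPI columns, the sample values, and the choice of two clusters are assumptions.

```python
# Illustrative only: segment customers by two assumed KPIs using k-means clustering.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical KPI table; in a real BI platform this would come from the warehouse.
kpis = pd.DataFrame({
    "order_frequency": [2, 15, 3, 20, 1, 18],
    "avg_order_value": [500, 120, 450, 90, 700, 110],
})

scaled = StandardScaler().fit_transform(kpis)          # put KPIs on a comparable scale
kpis["segment"] = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(scaled)
print(kpis)
```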
Education and Experience:
At Ganit we are building an elite team, so we are seeking candidates who possess the following backgrounds:
- 7+ years of relevant experience
- Expert-level skills in writing and optimizing complex SQL
- Knowledge of data warehousing concepts
- Experience in data mining, profiling, and analysis
- Experience with complex data modelling, ETL design, and using large databases in a business environment
- Proficiency with the Linux command line and systems administration
- Experience with languages like Python/Java/Scala
- Experience with Big Data technologies such as Hive/Spark (a minimal PySpark sketch follows this list)
- Proven ability to develop unconventional solutions, see opportunities to innovate, and lead the way
- Good experience working on cloud platforms like AWS, GCP & Azure, including projects involving the creation of a data lake or data warehouse
- Excellent verbal and written communication
- Proven interpersonal skills and the ability to convey key insights from complex analyses in summarized business terms; ability to effectively communicate with multiple teams
Good to have:
- AWS/GCP/Azure Data Engineer Certification
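To make the Spark/Hive and data-lake expectations above concrete, here is a minimal PySpark sketch; the S3 path, column names, and table name are assumptions, not an actual Ganit pipeline.

```python
# Illustrative batch step: read raw data-lake files, apply a simple data-quality rule,
# and write a partitioned Hive table. Paths, columns, and names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("orders_to_warehouse")
         .enableHiveSupport()
         .getOrCreate())

orders = spark.read.parquet("s3://example-data-lake/raw/orders/")   # hypothetical path

clean = (orders
         .filter(F.col("order_amount") > 0)                          # basic quality rule
         .withColumn("order_date", F.to_date("order_ts")))

(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders_clean"))                        # hypothetical table
```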

• Run the production environment by monitoring availability and taking a holistic view of system health (a minimal availability-probe sketch follows the requirements below)
• Build software and systems to manage platform infrastructure and applications
• Improve reliability, quality, and time-to-market of our suite of software solutions
• Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
• Provide primary operational support and engineering for multiple large distributed software applications
• Drive cross-team alignment across development teams around reliability initiatives
The ideal candidate must have:
• Bachelor’s degree in computer science or another highly technical, scientific discipline
• Ability to program (structured and OO) in one or more high-level languages, such as Python, Java, C/C++, Ruby, and JavaScript
• Good experience with microservices architecture and serverless technologies
• Exposure to event-driven architecture and state machines
• A proactive approach to spotting problems, areas for improvement, and performance bottlenecks
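As a sketch of the availability-monitoring responsibility above (not the team's actual stack), the following Python probe polls a health endpoint and reports availability over a short window; the URL, interval, and sample count are assumptions.

```python
# Hypothetical availability probe: poll a health endpoint and report the success rate.
import time

import requests

HEALTH_URL = "https://example.internal/healthz"   # assumed endpoint
SAMPLES = 5                                       # assumed window: five checks, 10s apart


def probe(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers 200 within the timeout."""
    try:
        return requests.get(url, timeout=timeout).status_code == 200
    except requests.RequestException:
        return False


failures = 0
for _ in range(SAMPLES):
    if not probe(HEALTH_URL):
        failures += 1
    time.sleep(10)

print(f"availability over window: {(SAMPLES - failures) / SAMPLES:.0%}")
```

A production setup would export this as a metric to an alerting system rather than printing it.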
Role - Sales Manager
- ShareNest is India's first technology-enabled, asset-light marketplace that makes second home ownership accessible, affordable and enjoyable for everyone. ShareNest's vision is to enrich lives by democratizing second home ownership, making it enjoyable and making it accessible for all. Our innovative co-ownership model is a unique combination of owning an aspirational, fully managed second home along with the benefit of financial upside from real estate investment.
- Founded by the Business Head of India's first PropTech Unicorn, ShareNest has secured $1.5 million in a seed round from marquee investors, industry leaders and unicorn founders which will help build a base for the business to scale it to new heights.
Key Responsibilities (Strategic):
- Create an action plan to seek out and target new customers and new sales opportunities, and initiate an action plan to approach and secure new business for the Company.
- Take accountability for qualified leads and see them through the sales funnel, which includes site visits, OBM, client meetings, etc., to achieve the sales target
- Curate prospective buyers and move customers through the sales funnel and cycle
- Build and maintain a productive and cooperative relationship with the clients
- Coordinate with Marketing Department to organize marketing and advertising events.
Required Skill-set
- Graduate from any field with 2-3 years of experience in real estate sales
- Should have excellent communication and interpersonal skills
- Track record of over-achieving sales targets and a target-driven attitude
- Ability to work in a fast-paced environment, open to travel and take up new challenges
Designation - L1 Tech Support
Location - Kharghar
Experience - 2 to 5 years
L1 Tech Support: BE/B.Tech only.
- Must have a minimum of 2-3 years of support experience on financial transaction systems
- Monitor and verify program execution; identify and communicate processing variances (e.g., run time, record counts) and potential issues
- Manage production alerts and take necessary action as per the run-book
- Create incident tickets and follow up until closure as per the Standard Operating Procedure
- Maintain and adhere to production schedules, including ad-hoc requests
- Stop and start system processes
- Troubleshoot failures and determine the best course of action
- Communicate with internal stakeholders regarding the daily production cycles
- Document all operational activities (Known Error Database, reports, troubleshooting steps, operations manual, important mail communications)
- Adhere to SLAs
- Good understanding of SQL and scheduling
- Familiarity/experience with UNIX, Linux, and Windows environments
- Familiarity with scheduling and job monitoring tools
- Must be able to work in all 3 shifts (morning/afternoon/night)
If interested, kindly share your updated resume, current CTC, and expected CTC.


------------------------
Solve problems in the speech and NLP domain using advanced deep learning and machine learning techniques. A few examples of the problems are:
* Limited-resource speaker diarization on mono-channel recordings in noisy environments.
* Speech enhancement to improve the accuracy of downstream speech analytics tasks.
* Automated speech recognition for accent-heavy audio with a noisy background.
* Speech analytics tasks, including emotion, empathy, and keyword extraction.
* Text analytics tasks, including topic modeling, entity and intent extraction, opinion mining, text classification, and sentiment detection on multilingual data.
A typical day at work
-----------------------------
You will work closely with the product team to own a business problem. You will then model the business problem as a machine learning problem. Next, you will do a literature review to identify approaches to solving the problem, test these approaches, identify the best one, add your own insights to improve performance, and ship it to production!
What should you know?
---------------------------------
* Solid understanding of Classical Machine Learning and Deep Learning concepts and algorithms.
* Experience with literature review either in academia or industry.
* Proficiency in at least one programming language such as Python, C, C++, Java, etc.
* Proficiency in Machine Learning tools such as TensorFlow, Keras, Caffe, Torch/PyTorch, or Theano (a minimal PyTorch sketch follows this list).
* Advanced degree in Computer Science, Electrical Engineering, Machine Learning, Mathematics, Statistics, Physics, or Computational Linguistics.
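To ground the tooling expectations above, here is a minimal PyTorch sketch of a bag-of-words sentiment classifier; the vocabulary size, embedding size, and toy inputs are assumptions and are unrelated to DeepAffects' actual models.

```python
# Illustrative only: a tiny bag-of-words sentiment classifier in PyTorch.
import torch
import torch.nn as nn


class BowSentimentClassifier(nn.Module):
    def __init__(self, vocab_size: int, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.EmbeddingBag(vocab_size, 64)  # averages token embeddings
        self.fc = nn.Linear(64, num_classes)

    def forward(self, token_ids, offsets):
        return self.fc(self.embedding(token_ids, offsets))


model = BowSentimentClassifier(vocab_size=1000)   # assumed vocabulary size
tokens = torch.tensor([1, 5, 7, 2, 9])            # two toy utterances, concatenated
offsets = torch.tensor([0, 3])                    # start index of each utterance
logits = model(tokens, offsets)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([1, 0]))
loss.backward()                                   # one illustrative training step
```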
Why DeepAffects?
--------------------------
* You’ll learn insanely fast here.
* ESOPs and competitive compensation.
* Opportunity and encouragement to publish research at top conferences, with paid trips to attend workshops and conferences where you have published.
* Independent work, flexible timings and sense of ownership of your work.
* Mentorship from distinguished researchers and professors.


Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties:
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level:
- Bachelor's degree in Computer Science or equivalent
Experience:
- Minimum 5+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data, and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design, and business processes
Proficiency in:
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices
- Experience generating physical data models and the associated DDL from logical data models
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts
- Experience enforcing data modeling standards and procedures
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills:
Must Know:
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow (a minimal DAG sketch follows this list)
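As a hedged illustration of the Airflow requirement, here is a minimal DAG with a single Python task; the DAG id, schedule, and extract_orders callable are assumptions rather than an actual pipeline.

```python
# Illustrative only: a minimal Airflow 2.x DAG with one Python task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    print("pretend to pull orders from a source system")  # placeholder work


with DAG(
    dag_id="daily_orders_pipeline",      # assumed name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_orders", python_callable=extract_orders)
```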
- Track & effectively address queries with timeliness and accuracy.
- Answer incoming queries over phone and email
- Resolve any customer complaints in a prompt and professional manner.
- Collaborate to ensure customer satisfaction.
- Should be able to work with, manage, and maintain the CRM software.
- Coordinate with internal managers for updating data
- Maintain complete and accurate customer correspondence data.