11+ Site survey Jobs in Mumbai | Site survey Job openings in Mumbai
Apply to 11+ Site survey Jobs in Mumbai on CutShort.io. Explore the latest Site survey Job opportunities across top companies like Google, Amazon & Adobe.
This role is responsible for architecting and implementing the Agentic capabilities of the PHI ecosystem. The engineer will lead the development of multi-agent systems, enabling seamless interoperability between AI agents, internal tools, and external services.
The position requires a strong focus on AI safety, secure agent orchestration, and tool-connected AI systems capable of executing complex workflows within the health insurance domain.
1. Agent Orchestration
- Build and manage autonomous AI agents using Agent Development Kit (ADK) and Vertex AI Agent Engine.
- Design and implement multi-agent workflows capable of handling complex tasks.
2. Interoperability
- Implement the Model Context Protocol (MCP) to enable connectivity between:
  - AI agents
  - Internal PHI tools
  - External services and APIs
3. Multimodal Development
- Build real-time, bidirectional audio applications using the Gemini Live API.
- Integrate image generation models and support multimodal AI capabilities.
4. Safety Engineering
- Implement AI safety layers to protect sensitive healthcare data.
- Use Model Armor and the Cloud DLP API to:
  - Sanitize prompts
  - Prevent exposure of PII/PHI data
  - Enforce secure AI interactions
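In production, prompt sanitization of this kind would be delegated to the Cloud DLP API's managed infoType detectors; as an illustrative, simplified sketch of the idea (not the actual DLP API), a regex-based redaction pass over prompts might look like this:

```python
import re

# Two illustrative PII patterns (email, 10-digit phone); a real deployment
# would rely on Cloud DLP's managed detectors rather than hand-written regexes.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{10}\b"),
}

def sanitize_prompt(prompt: str) -> str:
    """Replace each detected PII span with a typed redaction token."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[REDACTED_{label}]", prompt)
    return prompt
```

The typed tokens (`[REDACTED_EMAIL]`, `[REDACTED_PHONE]`) preserve enough context for the downstream model to keep the sentence coherent while the raw identifiers never reach it.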
5. Agent-to-Agent (A2A) Communication
- Configure remote agent connectivity using the A2A SDK.
- Enable cross-agent collaboration and workflow orchestration.
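The ADK, A2A SDK, and MCP handle the real orchestration and transport; the core routing pattern they build on — agents invoking named tools through a shared registry — can be sketched in plain Python. All names below (`ToolRegistry`, `lookup_policy`) are hypothetical, for illustration only:

```python
from typing import Callable, Dict

class ToolRegistry:
    """Minimal stand-in for MCP-style tool connectivity: agents call
    tools by name through one registry rather than hard-wired imports."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[[dict], dict]] = {}

    def register(self, name: str, fn: Callable[[dict], dict]) -> None:
        self._tools[name] = fn

    def call(self, name: str, args: dict) -> dict:
        # Unknown tools return a structured error instead of raising,
        # so an agent can recover mid-workflow.
        if name not in self._tools:
            return {"error": f"unknown tool: {name}"}
        return self._tools[name](args)

# Hypothetical internal PHI tool registered for agent use.
registry = ToolRegistry()
registry.register("lookup_policy", lambda a: {"policy_id": a["id"], "status": "active"})
```

Indirection through the registry is what makes the same agent able to reach internal tools, external APIs, or other agents uniformly — the interoperability MCP standardizes.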
Must-Have Skills
- Advanced proficiency with Agent Development Kit (ADK).
- Strong experience with Vertex AI Agent Engine.
- Hands-on experience with Model Context Protocol (MCP).
- Experience implementing Agent-to-Agent (A2A) workflows using the A2A SDK.
- Expertise in Google Gen AI SDK for Python.
- Experience building multimodal AI applications.
- Proven experience implementing AI safety layers, including:
  - Model Armor
  - Cloud DLP API
Good-to-Have Skills (Foundation)
Data & Analytics
- BigQuery optimization techniques, including:
  - Partitioning
  - Clustering
  - Denormalization for performance and cost optimization
Streaming & Real-Time Pipelines
- Experience building real-time data pipelines using:
  - Google Pub/Sub
  - BigQuery streaming pipelines
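The two ideas above meet in practice: streamed events are routed into date partitions, mirroring what a `PARTITION BY DATE(ts)` table does on BigQuery's side. A stdlib-only sketch of that routing step (the real pipeline would use the Pub/Sub and BigQuery client libraries):

```python
from collections import defaultdict
from datetime import datetime, timezone

def partition_key(event: dict) -> str:
    """Derive the daily partition a row lands in, mirroring
    BigQuery's PARTITION BY DATE(timestamp_column)."""
    ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
    return ts.strftime("%Y%m%d")

def route_events(events: list[dict]) -> dict[str, list[dict]]:
    """Group a micro-batch of streamed events by partition, as a
    Pub/Sub -> BigQuery streaming insert path effectively does."""
    partitions: dict[str, list[dict]] = defaultdict(list)
    for event in events:
        partitions[partition_key(event)].append(event)
    return dict(partitions)
```

Partition pruning is where the cost optimization comes from: queries filtered on the partition column scan only the matching daily buckets instead of the whole table.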
6+ years of hands-on development experience and in-depth knowledge of Java, Spring, Spring Boot, and Quarkus; front-end technologies such as Angular and React JS are nice to have.
● Excellent Engineering skills in designing and implementing scalable solutions
● Good knowledge of CI/CD Pipeline with strong focus on TDD
● Strong communication skills and ownership
● Exposure to Cloud, Kubernetes, Docker, Microservices is highly desired.
● Experience working in public cloud environments like AWS, Azure, and GCP with respect to solution development, deployment, and adoption of cloud-based technology components such as IaaS/PaaS offerings
● Proficiency in PL/SQL and Database development.
Strong in J2EE and OOP design patterns.
Salesforce Developer (Classic to Lightning Migration)
Shift: Afternoon / Night (Work from Office)
Locations: Bangalore | Mumbai | Pune | Hyderabad | Mohali | Panchkula
Working Days: 5 Days
About the Role:
We are looking for an experienced Salesforce Developer to support the Client’s project migration from Salesforce Classic to Lightning. The ideal candidate will have hands-on experience in Lightning configuration, customization, and integrations, along with strong problem-solving and communication skills.
Key Responsibilities:
- Assist in the end-to-end migration from Classic to Lightning Experience.
- Work closely with technical leads and architects to implement scalable and efficient Lightning solutions.
- Redesign and optimize existing data models, replacing unnecessary custom objects with standard Salesforce entities like Leads and Opportunities.
- Configure Lightning features such as Dynamic Forms, Lightning Pages, Flows, and Omni-Channel.
- Build and maintain Apex classes, triggers, Lightning Web Components (LWC), and integrations as needed.
- Collaborate with admins, QA, and business stakeholders to ensure timely and quality delivery of enhancements.
- Ensure adherence to security best practices, including profiles, permission sets, and sharing rules.
- Contribute ideas to improve system usability and reduce custom technical debt.
Qualifications & Skills:
- 6 years of Salesforce development experience.
- Strong exposure to Salesforce Lightning (Aura / LWC) and Classic-to-Lightning migration projects (At least 3 years).
- Solid understanding of Service Cloud and Case Management processes.
- Experience with Flows, Dynamic Actions, AppExchange tools, and low-code/no-code configurations.
- Proficiency in Apex, Visualforce, SOQL, REST/SOAP APIs, and Integration patterns.
- Salesforce Platform Developer I (mandatory), Developer II or Application Architect (preferred).
- Excellent communication skills and ability to work in a cross-functional, global environment.
Job description:
Accountant
Civicon Ventures, a growing infrastructure services company incorporated in February 2025, has a strong focus on railway-sector projects. Headquartered in Ulhasnagar, Maharashtra, Civicon is actively engaged in both construction and maintenance assignments, working with government, semi-government, and private entities.
JD: We are currently looking for an Accountant to join our growing team at our Mumbai - Kalyan/Ulhasnagar office.
Accountant – Role Overview
Key Responsibilities:
- Maintain accurate financial records, including ledgers and journals.
- Manage accounts payable and receivable – process vendor invoices, issue invoices, track payments.
- Perform regular bank and general ledger reconciliations.
- Assist in preparing monthly, quarterly, and annual financial reports.
- Handle daily financial transactions and ensure compliance with accounting standards.
Requirements:
- Bachelor’s degree in Accounting, Finance, or Commerce, or a CA qualification.
- 1–2 years of experience in a similar accounting role.
- Proficiency in tools like Tally ERP 9, QuickBooks, SAP, or Xero.
- Strong skills in Excel (pivot tables, VLOOKUP) and knowledge of Indian taxation laws (GST, TDS, Income Tax).
- Good communication skills in English and Hindi/Marathi.
Location: Ulhasnagar
Location: Mumbai (Andheri)
Work Mode: 5 Days Work from Office
Experience: 2+ Years
Employment Type: Full-Time
Notice Period: Immediate Joiners Preferred
About Deqode:
Deqode is a leading digital transformation and technology consulting company, helping businesses solve complex problems using cutting-edge technologies. We specialize in blockchain, AI/ML, cloud solutions, and enterprise mobility, with a strong focus on delivering scalable and innovative digital products for global clients.
Role Overview:
We are looking for a skilled React Native Developer to join our frontend engineering team. You will be responsible for building responsive, scalable, and high-performance mobile applications across Android and iOS platforms. You’ll collaborate with cross-functional teams including backend developers, designers, and product managers to deliver seamless mobile experiences.
Responsibilities:
- Develop and maintain robust mobile applications using React Native
- Build reusable and efficient components and libraries for future use
- Integrate applications with backend APIs and third-party services
- Optimize applications for maximum speed, responsiveness, and performance
- Ensure quality, maintainability, and scalability of your codebase
- Participate in the full mobile app lifecycle – from planning, development, testing to deployment
- Collaborate in an agile environment with daily stand-ups, code reviews, and regular releases
Must-Have Skills:
- Minimum 2+ years of hands-on experience with React Native
- Strong command over JavaScript (ES6+) and TypeScript
- Experience with state management tools like Redux, MobX, or Context API
- Familiarity with native build tools such as Android Studio, Xcode
- Experience integrating RESTful APIs and handling async operations
- Solid understanding of mobile UI/UX design principles
- Proficient in Git, version control workflows, and CI/CD pipelines
- Good grasp of cross-platform compatibility and responsiveness
Good to Have:
- Exposure to Node.js or full-stack development
- Familiarity with databases like MongoDB, MySQL, or Redis
- Experience in integrating analytics, push notifications, and app deployment
- Bonus: Knowledge of Vue.js
The Role:
As an ML Engineer at TIFIN, you will be responsible for driving research and innovation in a result-oriented direction. Your role will involve staying updated on the latest research trends and exploring advancements in Natural Language Understanding (NLU) and their applications in conversational AI. You will also play a mentoring role and contribute to improving NLU capabilities using transfer learning from historical conversational data.
Requirements:
- Experience with training and fine-tuning machine learning models on large text datasets.
- Strong computer science fundamentals and at least 3 years of software development experience.
- Track record of thinking big and finding simple solutions while dealing with ambiguity.
- Proven experience as a Natural Language Processing Engineer or in similar roles.
- Good understanding of NLP tricks and techniques for semantic extraction, data structure, and modeling.
- Familiarity with text representation techniques, algorithms, and statistics.
- Knowledge of programming languages such as R, Python, and Java.
- Proficiency in Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Ability to design software architectures and solve complex problems.
- Strong analytical and problem-solving skills.
- Experience in projects related to information retrieval, machine comprehension, entity recognition, text classification, semantic frame parsing, or machine translation is a plus.
- Publications, patents, or conference talks in relevant fields are a bonus.
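The requirements above mention text classification and semantic extraction; as a purely illustrative baseline (real work at this level would use TensorFlow/PyTorch and pretrained language models), a multinomial Naive Bayes text classifier over bag-of-words features fits in a few lines of stdlib Python:

```python
import math
from collections import Counter, defaultdict

class TinyNB:
    """Multinomial Naive Bayes over whitespace tokens with add-one smoothing."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)  # per-class token counts
        self.class_counts = Counter(labels)      # class priors
        self.vocab = set()
        for text, label in zip(texts, labels):
            tokens = text.lower().split()
            self.word_counts[label].update(tokens)
            self.vocab.update(tokens)
        return self

    def predict(self, text):
        scores = {}
        for label in self.class_counts:
            total = sum(self.word_counts[label].values())
            score = math.log(self.class_counts[label])  # log prior
            for tok in text.lower().split():
                # Laplace-smoothed log likelihood of each token under the class.
                score += math.log(
                    (self.word_counts[label][tok] + 1) / (total + len(self.vocab))
                )
            scores[label] = score
        return max(scores, key=scores.get)
```

The same structure (class priors plus per-token log likelihoods) is the baseline transfer-learning approaches are measured against when mining historical conversational data.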
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise its game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool such as Pentaho, NiFi, or SSIS (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
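The ETL responsibilities listed above can be sketched in miniature. This is an illustrative stdlib example of one extract-transform-load step over CSV records (the field names are hypothetical; Spark, Pentaho, or NiFi would run the equivalent at scale):

```python
import csv
import io

def etl(raw_csv: str) -> dict[str, float]:
    """Extract rows from CSV text, transform (normalize + cast),
    and load an aggregate: total amount per region."""
    totals: dict[str, float] = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):
        region = row["region"].strip().lower()   # transform: normalize the key
        amount = float(row["amount"])            # transform: cast to numeric
        totals[region] = totals.get(region, 0.0) + amount  # load: aggregate
    return totals
```

The same three stages map directly onto a distributed pipeline: the read becomes a Spark/Hive source, the per-row cleanup a map, and the aggregation a groupBy, with Airflow scheduling the whole job.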
- Ability to communicate effectively, verbally and in writing
- Handle walk-in sales
- Able to sell memberships to members
- Not target-driven, but achieving targets should be the key
- Able to make and keep complete records of clients
- Handle web inquiries
- Communicate with and assure customers about our best products and services
- Build rapport and maintain good relationships with existing customers
- Able to listen to and understand a client's medical history and related problems, and accordingly suggest the Weight Loss Programme
- Tele-marketing and mass mailers
- Explain and offer different packages to prospects, coordinating with customers and the dietitian