
What the company wants:
They want someone who can build and manage relationships with real estate brokers. You’ll be the main point of contact for brokers, onboard new ones into the network, and work with them to bring in buyers and properties. Your job is to keep brokers engaged and motivated so they consistently deliver results.
Role:
- Be the single point of contact (POC) for all brokers.
- Onboard new brokers into the company’s channel partner network.
- Build and maintain strong, long-term broker relationships.
- Work with brokers to bring in qualified buyers for listed properties.
- Encourage brokers to source high-potential properties for the company.
- Track broker performance and give feedback for improvement.
- Conduct regular broker training to align them with company processes.
- Stay updated on market trends and identify new opportunities.
- Travel frequently to meet brokers and keep them engaged.

Similar jobs
We are seeking a SAP QM Architect with strong pharmaceutical and life sciences domain expertise to lead the design and development of GxP-compliant Quality Management products and extensions on the SAP S/4HANA and SAP BTP platforms.
This role focuses on building reusable, validated product components—not just project-based implementations. You will collaborate with SAP engineering and product teams to conceptualize and deliver digital quality solutions that are scalable, compliant, and aligned with GMP, GAMP 5, and 21 CFR Part 11 standards.
Key Responsibilities
- Lead product design and architecture for SAP QM modules and SAP Industry Cloud applications built for GxP environments.
- Translate pharmaceutical quality processes (In-Process Inspection, CAPA, Deviation, Stability, COA) into configurable SAP product features.
- Develop validation-ready product artifacts including audit trail, e-signature, and data integrity frameworks.
- Collaborate with product managers, domain SMEs, and developers to map regulatory requirements into product functionality.
- Define standard APIs and integration templates for LIMS, MES, and DMS using SAP BTP/CPI (a rough sketch follows this list).
- Contribute to functional specifications, UI/UX design, and end-to-end validation for new product releases.
- Ensure all product builds follow GxP compliance and industry-standard product lifecycle management practices.
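For illustration only, here is a minimal Python sketch of the kind of LIMS-facing REST integration the API bullet above describes. The endpoint, payload fields, and token handling are hypothetical stand-ins, not an actual SAP BTP/CPI or LIMS API; a real integration would run as a CPI iFlow with validated error handling.

```python
# Hypothetical sketch: push one QM results-recording entry to a LIMS over TLS.
import json
import requests

LIMS_URL = "https://lims.example.com/api/v1/inspection-results"  # made-up endpoint

def post_inspection_result(lot_id: str, characteristic: str, value: float, token: str) -> None:
    """Send a single inspection result; raises on HTTP errors so failures are auditable."""
    payload = {
        "lotId": lot_id,                # SAP QM inspection lot number
        "characteristic": characteristic,
        "value": value,
    }
    resp = requests.post(
        LIMS_URL,
        data=json.dumps(payload),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        timeout=10,  # fail fast; middleware retries would handle transient errors
    )
    resp.raise_for_status()

if __name__ == "__main__":
    post_inspection_result("10000001", "pH", 6.8, token="...")
```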
Required Experience & Skills
- 10–15 years of relevant experience in SAP QM, including 5+ years in GxP-regulated pharma or life sciences.
- Proven background in SAP product development, co-innovation, or accelerator/extension builds.
- Deep expertise in Inspection Planning, Sampling, Results Recording, CAPA, Stability Studies, and COA generation.
- Strong understanding of Computer System Validation (CSV), audit trail requirements, and validation documentation (IQ/OQ/PQ).
- Hands-on experience with SAP S/4HANA, SAP BTP, CPI, and Fiori/UI5.
- Exposure to Agile product lifecycle, CI/CD pipelines, and release management is preferred.
Education & Certifications
- Bachelor’s or Master’s in Engineering, Life Sciences, or Pharmaceutical Sciences.
- SAP QM certification and/or GxP, CSV, or GAMP 5 credentials preferred.
Job Title: Cybersecurity Agent Developer
Location: Bengaluru, India
Experience: 7+ Years
Employment Type: Full-time
About the Role:
We are seeking a highly skilled Cybersecurity Agent Developer with deep expertise in C/C++ and Golang or Rust to build and optimize high-performance security agents for Windows, Linux, and macOS platforms. This role requires a strong background in low-level system programming, performance tuning, and security-centric design to ensure effective monitoring, threat detection, and system protection across diverse environments.
Key Responsibilities:
- Design, develop, and maintain cross-platform endpoint security agents.
- Optimize agent performance to ensure minimal system overhead and real-time responsiveness.
- Implement system-level hooks and monitoring components, including:
  - Process monitoring
  - File system and network activity tracking
  - System telemetry collection
- Work with kernel-level APIs and frameworks, such as:
  - ETW, WFP, WMI, MiniFilter (Windows)
  - eBPF, auditd, fanotify, netfilter (Linux; see the sketch after this list)
  - EndpointSecurity framework, XPC, System Extensions (macOS)
- Build robust, secure inter-process communication (IPC) and data serialization mechanisms.
- Integrate agents with cloud-based security platforms via REST APIs, gRPC, and TLS.
- Collaborate with internal teams (threat intelligence, detection, response) to evolve agent capabilities.
- Perform in-depth debugging, profiling, and optimization using industry-standard tools.
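As a rough illustration of the Linux monitoring surface named above, here is a minimal eBPF process-exec tracer using the bcc Python bindings. A production agent of the kind this role builds would be written in C/C++, Go, or Rust; this sketch only shows the concept and assumes root privileges on an eBPF-capable kernel with bcc installed.

```python
# Minimal process-monitoring sketch: trace execve() system calls via eBPF/bcc.
from bcc import BPF

PROG = r"""
int trace_exec(struct pt_regs *ctx) {
    bpf_trace_printk("execve observed\n");   // one event per process launch
    return 0;
}
"""

b = BPF(text=PROG)
# The syscall symbol name varies across kernels; bcc resolves it for us.
b.attach_kprobe(event=b.get_syscall_fnname("execve"), fn_name="trace_exec")

print("Tracing execve calls... Ctrl-C to stop")
b.trace_print()  # streams events from the kernel trace pipe
```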
Required Skills & Experience:
Core Programming:
- Strong proficiency in C/C++ and either Golang or Rust
- Solid experience in multi-threaded and asynchronous programming
Platform Expertise:
- Proven experience developing for Windows, Linux, and macOS
- Deep knowledge of system-level programming, including:
- Windows: WinAPI, ETW, WFP, WMI, MiniFilter
- Linux: eBPF, auditd, fanotify, netfilter
- macOS: EndpointSecurity framework, XPC, System Extensions
Security & Networking:
- Understanding of secure IPC, TLS, gRPC, and secure coding practices
- Familiarity with system hardening and secure memory management
Debugging & Optimization Tools:
- Proficient in using tools like GDB, LLDB, Valgrind, Perf, Wireshark, Sysinternals Suite
Version Control:
- Strong experience with Git (GitHub, GitLab)
Preferred Qualifications:
- Experience with cybersecurity frameworks like MITRE ATT&CK, Sysmon, YARA, Suricata
- Hands-on exposure to kernel/driver development
- Familiarity with EDR/XDR, sandboxing, and SIEM integrations
- Understanding of malware analysis and threat detection techniques
- Exposure to container security and cloud-native security agent development
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (a minimal sketch follows this list).
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
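A minimal sketch of the Kafka-to-Snowflake ingestion path mentioned in this list, using snowflake-connector-python. The account, stage, table, and pipe names are hypothetical placeholders; in practice key-pair auth and cloud-storage event notifications would complete the setup.

```python
# Hedged sketch: create a landing table and a Snowpipe for continuous JSON ingestion.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholders throughout; not real credentials
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="RAW",
    schema="EVENTS",
)

with conn.cursor() as cur:
    # Landing table for semi-structured Kafka events.
    cur.execute("CREATE TABLE IF NOT EXISTS events_raw (payload VARIANT)")
    # AUTO_INGEST lets cloud-storage notifications trigger loads as files arrive.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
        COPY INTO events_raw
        FROM @events_stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
conn.close()
```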
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views (see the tuning sketch after this list).
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
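To make the tuning bullets concrete, here is a hedged sketch of two of the levers named above: a clustering key to help prune micro-partitions, and a resource monitor to cap warehouse spend. Object names are hypothetical, and creating resource monitors typically requires the ACCOUNTADMIN role.

```python
# Hedged sketch: Snowflake performance/cost tuning via the Python connector.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="etl_user", password="...")

with conn.cursor() as cur:
    # Cluster a large fact table on the columns most queries filter by.
    cur.execute("ALTER TABLE analytics.shipments CLUSTER BY (ship_date, region)")
    # Suspend the warehouse once 90% of the daily credit quota is consumed.
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR rm_etl_daily
        WITH CREDIT_QUOTA = 10 FREQUENCY = DAILY START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 90 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE ETL_WH SET RESOURCE_MONITOR = rm_etl_daily")
conn.close()
```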
- 4+ years of experience with strong fundamentals in Windchill customization and configuration, reporting framework customization, workflow customization, and customer handling
- Strong customization background in Form Processors, Validators, data utilities, Form Controllers, etc.
- Strong programming skills in Java/J2EE technologies: JavaScript, GWT, jQuery, XML, JSPs, SQL, etc.
- Deep knowledge of Windchill architecture
- Experience in at least one full-lifecycle PLM implementation with Windchill
- Strong coding skills in Windchill development and customization; ThingWorx Navigate development (mandatory); ThingWorx architecture configuration and mashup creation; ThingWorx and Windchill upgrades
- Build and configuration management (mandatory): HPQC/JIRA/Azure/SVN/GitHub/Ant
- Knowledge of and experience in the build and release process
- Experience with custom upgrades is a plus
- Understanding of the application's development environment, database, data management, and infrastructure capabilities and constraints; understanding of database administration, database design, and performance tuning
- Follow quality processes for tasks, with appropriate reviews, and participate in knowledge sharing within the team
Responsibilities
- Take product ideas from ideation to implementation.
- Collaborate with Product and Design to create robust and usable features.
- Take ownership of the architecture for best performance and usability.
- Lead engineering discussions to ensure the best practices, maintainability and security of the application.
- Participate in design and code reviews.
- Lead and mentor the team of developers, to drive business objectives.
We are looking for
- 4+ years of experience building quality applications.
- Proficient in client-side JS frameworks such as React.js
- Passion for best design and coding practices and a desire to develop new bold ideas.
- Strong customer focus, ownership, and self-drive.
- Experience in planning, designing architecture and leading teams.
- Experience with SaaS products or an early stage startup is a plus.
XpressBees, a logistics company started in 2015, is among the fastest-growing companies in its sector. Our vision to evolve into a strong full-service logistics organization reflects itself in our various lines of business: B2C logistics (3PL), B2B Xpress, Hyperlocal, and Cross-Border Logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into India's most trusted logistics partner. XB has progressively carved its way towards best-in-class technology platforms, an extensive logistics network, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the near future include strengthening our presence as a service provider of choice and leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a high pace. This is a great opportunity to join the team forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning and Data Engineering, leading projects and teams of AI Engineers in collaboration with Data Scientists. In this role, you will build high-performance AI/ML solutions using groundbreaking AI/ML and Big Data technologies. You will need to understand business requirements and convert them into solvable data science problem statements, and you will be involved in end-to-end AI/ML projects, from small-scale POCs all the way to full-scale ML pipelines in production.
Seasoned AI/ML Engineers will own the implementation and productionization of cutting-edge AI-driven algorithmic components for search, recommendation, and insights that improve the efficiency of the logistics supply chain and serve customers better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact on the organization while solving challenging problems in the areas of AI, ML, Data Analytics, and Computer Science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
An Artificial Intelligence Engineer will apply themselves in areas such as:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning: Logistic Regression, Decision Trees, Random Forests, XGBoost, etc. (a toy example follows this list)
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
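As a toy example of the classical-ML toolkit in this list, here is a random forest trained on synthetic, imbalanced data, standing in for something like a fraud-detection classifier on delivery features. None of the data, features, or numbers here are real; it is a sketch of the technique only.

```python
# Toy sketch: random-forest classifier on synthetic imbalanced data (fraud-style labels).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# ~5% positive class, mimicking rare fraudulent delivery attempts.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# AUC is a better yardstick than accuracy on imbalanced labels.
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```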
The AI Engineering team enables internal teams to add AI capabilities to their apps and workflows easily via APIs, without needing to build AI expertise in each team, spanning Decision Support, NLP, and Computer Vision for public clouds and the enterprise in NLU, Vision, and Conversational AI. The candidate is adept at working with large data sets to find opportunities for product and process optimization, and at using models to test the effectiveness of different courses of action. They must have experience with a variety of data mining and data analysis methods and data tools, with building and implementing models, with using and creating algorithms, and with creating and running simulations. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backends, that automates training and deployment of ML models.
● Build cloud services in Decision Support (anomaly detection, time series forecasting, fraud detection, risk prevention, predictive analytics), computer vision, natural language processing (NLP), and speech that work out of the box.
● Brainstorm and design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists and software engineers to build out other parts of the infrastructure, effectively communicating your needs, understanding theirs, and addressing external and internal stakeholders' product challenges.
● Build the core of Artificial Intelligence and AI services such as Decision Support, Vision, Speech, Text, NLP, NLU, and others.
● Leverage cloud technology: AWS, GCP, Azure.
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow), Big Data, Hadoop, HBase, Spark, etc.
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies.
● Assess the effectiveness and accuracy of new data sources and data-gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experience, supply chain metrics, and other business outcomes.
● Develop the company A/B testing framework and test model quality (see the sketch after this list).
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects with data science techniques and associated libraries such as AI/ML or equivalent NLP (Natural Language Processing) packages. Such techniques require a strong understanding of statistical models, probabilistic algorithms, classification, clustering, deep learning, and related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets, and architectural patterns for successful delivery.
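As a minimal illustration of the A/B testing bullet above, here is a Welch's t-test comparing a per-shipment metric across two variants. The metric (delivery hours), effect size, and sample sizes are invented for the sketch; a real framework would also handle assignment, guardrail metrics, and multiple testing.

```python
# Hedged sketch: two-sample A/B comparison on a synthetic per-shipment metric.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=24.0, scale=4.0, size=1000)    # variant A: delivery hours
treatment = rng.normal(loc=23.4, scale=4.0, size=1000)  # variant B

# Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level")
```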
What is required of you?
You will get an opportunity to build and operate a suite of massive-scale, integrated data/ML platforms in a broadly distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering.
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience building high-performance, resilient, scalable, and well-engineered systems.
● Experience in CI/CD and development best practices, instrumentation, and logging systems.
● Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as classification, information retrieval, clustering, knowledge graphs, semi-supervised learning, and ranking.
● Knowledge of and experience in statistical and data mining techniques: GLM/regression, random forests, boosting, trees, text mining, social network analysis, etc.
● Knowledge of web services: Redshift, S3, Spark, DigitalOcean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from third-party providers: Google Analytics, SiteCatalyst, Coremetrics, AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: MapReduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using QuickSight, Periscope, Business Objects, D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages and drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for machine learning and complete feedback loops.
● Knowledge of the machine learning lifecycle and experience working with data scientists.
● Experience with relational and NoSQL databases.
● Experience with workflow scheduling/orchestration tools such as Airflow or Oozie.
● Working knowledge of current techniques and approaches in machine learning and statistical or mathematical models.
● Strong data engineering and ETL skills to build scalable data pipelines; exposure to a data streaming stack (e.g. Kafka).
● Relevant experience in fine-tuning and optimizing ML (especially deep learning) models to bring down serving latency.
● Exposure to an ML model productionization stack (e.g. MLflow, Docker); a hedged sketch follows this list.
● Excellent exploratory data analysis skills to slice and dice data at scale using SQL in Redshift/BigQuery.
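To illustrate the MLflow productionization item above, here is a hedged sketch of logging parameters, metrics, and a fitted model so a downstream serving job can retrieve it. The experiment name and model are illustrative only, not part of any real pipeline.

```python
# Hedged sketch: track a toy model run with MLflow for later serving/retrieval.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)

mlflow.set_experiment("eta-prediction-poc")  # made-up experiment name
with mlflow.start_run():
    model = Ridge(alpha=1.0).fit(X, y)
    mlflow.log_param("alpha", 1.0)
    mlflow.log_metric("train_r2", model.score(X, y))
    # Persist the model artifact so a serving job can load it by run ID.
    mlflow.sklearn.log_model(model, "model")
```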
We're looking for a Fullstack Engineer to work remotely for a product-based HealthTech Startup based in Mumbai.
Requirements
- 4+ years of professional full-stack software development experience.
- React Native (2+ years) and TypeScript are required.
- Experience with NodeJS, Swift, Objective-C, and Java is a plus but not required.
- Excellent written and verbal English communication is a must.
- Work mode: Remote / Mumbai
- Salary: up to 30 LPA + ESOPs
- Notice period: 45 days or less preferred
- Interview process: 1 online test > founder discussion 1 > founder discussion 2 > offer
Are you passionate about using technology to make people's lives better? Are you interested in becoming a part of one of the hottest trends in the world of start-ups today? Are you excited about joining the online ultra-fast grocery delivery service business pioneer and driving the trend forward? Then this may be the right opportunity for you.
Role and Responsibilities
- Lead a product in the Technology department.
- Self-sufficiently lead the software development lifecycle: technical design, implementation, testing, deployment, monitoring.
- Work with cross-functional teams effectively to enable business growth.
- Recruit, train, retain, and simply make Blok a great yet challenging place to work for the people in your team.
- Mentor team, resulting in trickle-down happiness and efficiency.
Requirements
- You have at least 2 years of experience in a managerial position in a technology company.
- You’ve designed, built, scaled, and maintained production services, and know how to compose a service-oriented architecture. You’ll lead by example, executing when necessary.
- Bias towards action. You believe that speed and quality aren't mutually exclusive. You've shown good judgment about shipping as fast as possible while still making sure that products are built in a sustainable, responsible way. You embrace the agile software development mindset.
- Hiring prowess. You're a strong leader who can attract talent around the world, raising the bar for excellence. You have retained, mentored, and hired senior engineers and managers, with a track record of building productive, world-class engineering teams.
- Mentorship. You know that the most important part of your job is setting the team up for success. Through motivating, mentoring, teaching, and reviewing, you help other specialists make sound architectural decisions, improve their code quality, and get out of their comfort zone.
- You can communicate clearly and concisely with others at all levels within the organization. You have experience effectively creating alignment with multiple teams and operating well in ambiguity.
- Dedication. You care tremendously about keeping the experience consistent for users. You are your harshest critic and hold yourself personally accountable, jumping in and taking ownership of problems that might not even be in your team's scope.
- Experience within our tech-stack (Microservices, Node.js, Java, React, MongoDB, Kotlin, Swift, Redis, AWS, Docker, Kubernetes, RabbitMQ, Elasticsearch, WebSockets, etc.)
Who You Are
- Passionate about technology and making an impact.
- A perpetual learner, who stretches their boundaries and enjoys new ideas.
- A doer who takes initiative regardless of boundaries, empowers their teams, and works well in a cross-functional setup.
We are currently hiring a professional, skilled Software Developer to develop, create, and modify our company's application software and specialized utility programs.
Role & Responsibilities
- Researching, designing, implementing and managing software programs
- Testing and evaluating new programs
- Identifying areas for modification in existing programs and subsequently developing these modifications.
- Writing and implementing efficient code
- Bug fixing and debugging existing code.
- Understanding product requirements, architecture, design
Qualifications & Skills Required
- B.Tech (CS/IT)/BCA/MCA or any specialized course in the required field
- Must know C/C++/Java/.NET
- 0–2 years of experience in software development.
For C/C++ Developer
- Deep knowledge of, and hands-on experience with, web applications and programming languages like C/C++
- Design, build, and maintain efficient, reusable, and reliable C++ code
- Good technical skills related to email services, operating systems, and file systems
- Ability to work independently and multi-task effectively
- Understanding of projects from the client's as well as your own viewpoint
- The ability to learn new technologies quickly
- Help maintain code quality, organization, and automation
- Enhance product test and build infrastructure
For Java Developer
- Strong background/knowledge in Core Java
- Must-have skills: Core Java, Swing, J2EE
- Must know Collections, multithreading, and file handling
- Must know how to use Eclipse/NetBeans
- Polished professional with strong communication skills
- Excellent design, coding, and debugging skills
- Ability to troubleshoot a variety of complex software problems
- Sound knowledge of Computer Science fundamentals: data structures, algorithms, and operating system concepts
Position: Software Developer
No. of positions: 10
CTC: Negotiable
Location: Dehradun
Job type: Full-time
Timing: 10 AM to 7 PM








