Job Description
- Understanding of the real estate market
- Design and implement sales strategies to achieve targeted revenue
- Meet prospective customers to present projects and follow up on sales
- Build product knowledge through market research for better deliverables
- Achieve sales targets for all residential and commercial projects
- Respond to customer and broker queries and complaints in a timely manner
- Identify potential customers and engage with them regularly to convert leads
- Handle both direct sales and channel sales
- Good, relevant experience in sales
- Good communication skills

About Star Estate
Supercharge Your Career as an Oracle Integration Cloud (OIC) Developer at Technoidentity!
Are you ready to solve people challenges that fuel business growth? At Technoidentity, we’re a Data+AI product engineering company building cutting-edge solutions in the FinTech
domain for over 13 years—and we’re expanding globally. It’s the perfect time to join our team of tech innovators and leave your mark!
What’s in it for You?
We are seeking an experienced Oracle Integration Cloud (OIC) Developer to design, develop, and implement integrations and reporting solutions across Oracle Cloud and third-party systems. This role requires strong hands-on expertise in Oracle middleware, BI Publisher, OTBI, and VBCS, along with the ability to work collaboratively with business, functional, and technical teams in a fast-paced, global environment.
What Will You Be Doing?
• Design and build integrations using Oracle Integration Cloud (OIC) to meet business and application requirements.
• Configure and deploy Oracle middleware solutions aligned with functional and technical design specifications.
• Develop technical specification documents based on Functional Design Documents (FDDs).
• Collaborate with business and functional teams to translate requirements into technical solutions and detailed test cases.
• Build BI Publisher reports, reconciliation reports, and drill-down dashboards to support analytical and operational needs.
• Develop OTBI reports and infolets for ad hoc analytics and performance tracking.
• Create and enhance VBCS applications and APEX web pages for effective data management by business users.
• Perform unit testing, system testing, and support QA processes to ensure robust, high-quality deliverables.
• Participate in design discussions to understand end-to-end business processes and continuously enhance technical capabilities.
What Makes You the Perfect Fit?
• 5–8 years of overall experience with Oracle middleware, including 3+ years hands-on with Oracle Integration Cloud (OIC).
• Proven experience integrating Oracle ERP Cloud / E-Business Suite with third-party applications.
• Strong proficiency in OIC, SQL, PL/SQL, Web Services (SOAP, REST), JSON/XML, and XSLT processing.
• Hands-on experience in VBCS (Visual Builder Cloud Service) — mandatory.
• Expertise in Oracle BI Publisher (data models, templates, XSLT/RTF layouts, bursting, and advanced report capabilities).
• Experience building OTBI reports, dashboards, and infolets.
• Familiarity with Oracle PaaS architecture, cloud security concepts, and SaaS integrations (e.g., Oracle ERP Cloud, Workday).
• Experience with CI/CD pipelines using Jenkins, GitHub, and JIRA.
• Strong communication and interpersonal skills for cross-functional collaboration.
• Bachelor’s or Master’s degree in Engineering, Computer Applications, or a related discipline.
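As a flavor of the JSON/XML processing the skills above call for, here is a minimal Python sketch of shaping an XML payload into JSON, the kind of message transformation an OIC integration performs between endpoints. The `<Invoice>` payload shape is a hypothetical example, not an Oracle schema.

```python
# Minimal sketch: flatten a simple XML payload into JSON using only the
# standard library. The <Invoice> structure is an invented example.
import json
import xml.etree.ElementTree as ET

def invoice_xml_to_json(xml_text: str) -> str:
    """Flatten a one-level <Invoice> XML document into a JSON string."""
    root = ET.fromstring(xml_text)
    record = {child.tag: child.text for child in root}
    return json.dumps(record)

payload = "<Invoice><Id>1001</Id><Amount>250.00</Amount></Invoice>"
print(invoice_xml_to_json(payload))  # {"Id": "1001", "Amount": "250.00"}
```

In a real OIC flow this mapping would typically be expressed in XSLT or the visual mapper rather than hand-written code.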
Additional Details
• Shift: Night shift (aligned to US PST time zone).
• Work Mode: Remote (India-based).
• Engagement: Full-time
🤖 Data Scientist – Frontier AI for Data Platforms & Distributed Systems (4–8 Years)
Experience: 4–8 Years
Location: Bengaluru (On-site / Hybrid)
Company: Publicly Listed, Global Product Platform
🧠 About the Mission
We are building a Top 1% AI-Native Engineering & Data Organization — from first principles.
This is not incremental improvement.
This is a full-stack transformation of a large-scale enterprise into an AI-native data platform company.
We are re-architecting:
- Legacy systems → AI-native architectures
- Static pipelines → autonomous, self-healing systems
- Data platforms → intelligent, learning systems
- Software workflows → agentic execution layers
This is the kind of shift you would expect from companies like Google or Microsoft —
except here, you will build it from day zero and scale it globally.
🧠 The Opportunity: This role sits at the intersection of three high-impact domains:
1. Frontier AI Systems: Large Language Models (LLMs), Small Language Models (SLMs), and Agentic AI
2. Data Platforms: Warehouses, Lakehouses, Streaming Systems, Query Engines
3. Distributed Systems: High-throughput, low-latency, multi-region infrastructure
We are building systems where:
- Data platforms optimize themselves using ML/LLMs
- Pipelines are autonomous, self-healing, and adaptive
- Queries are generated, optimized, and executed intelligently
- Infrastructure learns from usage and evolves continuously
This is: AI as the control plane for data infrastructure
🧩 What You’ll Work On
You will design and build AI-native systems deeply embedded inside data infrastructure.
1. AI-Native Data Platforms
- Build LLM-powered interfaces:
- Natural language → SQL / pipelines / transformations
- Design semantic data layers:
- Embeddings, vector search, knowledge graphs
- Develop AI copilots:
- For data engineers, analysts, and platform users
2. Autonomous Data Pipelines
- Build self-healing ETL/ELT systems using AI agents
- Create pipelines that:
- Detect anomalies in real time
- Automatically debug failures
- Dynamically optimize transformations
3. Intelligent Query & Compute Optimization
- Apply ML/LLMs to:
- Query planning and execution
- Cost-based optimization using learned models
- Workload prediction and scheduling
- Build systems that:
- Learn from query patterns
- Continuously improve performance and cost efficiency
4. Distributed Data + AI Infrastructure
- Architect systems operating at:
- Billions of events per day
- Petabyte-scale data
- Work with:
- Distributed compute engines (Spark / Flink / Ray class systems)
- Streaming systems (Kafka-class infra)
- Vector databases and hybrid retrieval systems
5. Learning Systems & Feedback Loops
- Build closed-loop AI systems:
- Execution → feedback → model updates
- Develop:
- Continual learning pipelines
- Online learning systems for infra optimization
- Experimentation frameworks (A/B, bandits, eval pipelines)
6. LLM & Agentic Systems (Infra-Aware)
- Build agents that understand data systems
- Enable:
- Autonomous pipeline debugging
- Root cause analysis for infra failures
- Intelligent orchestration of data workflows
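The natural-language-to-SQL interfaces described above can be sketched roughly as follows. The model call is stubbed out with a canned completion, and the table names and guardrail rules are illustrative assumptions, not a production design.

```python
# Sketch of an LLM-powered NL-to-SQL interface with a guardrail step.
# The model call is a stub; in production it would hit a real LLM endpoint.
# Table and column names here are hypothetical.

ALLOWED_TABLES = {"events", "users"}

def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call; returns a canned completion.
    return "SELECT user_id, COUNT(*) FROM events GROUP BY user_id"

def nl_to_sql(question: str) -> str:
    prompt = (
        "Translate to SQL over tables events(user_id, ts) and "
        f"users(user_id, country). Question: {question}"
    )
    sql = fake_llm(prompt).strip()
    # Guardrails: only read-only statements over known tables are accepted.
    if not sql.upper().startswith("SELECT"):
        raise ValueError("only SELECT statements are allowed")
    if not any(t in sql.lower() for t in ALLOWED_TABLES):
        raise ValueError("query must reference a known table")
    return sql

print(nl_to_sql("How many events per user?"))
```

The guardrail step matters as much as the generation step: generated SQL is untrusted input until validated.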
🧠 What We’re Looking For
Core Foundations
- Strong grounding in:
- Machine Learning, Deep Learning, NLP
- Statistics, optimization, probabilistic systems
- Distributed systems fundamentals
- Deep understanding of:
- Transformer architectures
- Modern LLM ecosystems
Hands-On Expertise
- Experience building:
- LLM / GenAI systems (RAG, fine-tuning, embeddings)
- Data platforms (warehouse, lake, lakehouse architectures)
- Distributed pipelines and compute systems
- Strong programming skills:
- Python (ML/AI stack)
- SQL (deep understanding — query planning, optimization mindset)
Systems Thinking (Critical)
You think in systems, not components.
- Built or worked on:
- Large-scale data pipelines
- High-throughput distributed systems
- Low-latency, high-concurrency architectures
- Understand:
- Query optimization and execution
- Data partitioning, indexing, caching
- Trade-offs in distributed systems
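The partitioning trade-offs noted above can be made concrete with a minimal consistent-hashing sketch; node names and the virtual-node count are arbitrary choices for the example. The property being illustrated: when a node joins or leaves, only a fraction of keys are remapped, unlike naive modulo partitioning.

```python
# Minimal consistent-hashing ring with virtual nodes.
import bisect
import hashlib

def _h(key: str) -> int:
    # Stable hash of a string key onto the ring's integer space.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class ConsistentHashRing:
    def __init__(self, nodes, vnodes=100):
        # Each physical node is placed at `vnodes` points for balance.
        self._ring = sorted(
            (_h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes)
        )
        self._keys = [h for h, _ in self._ring]

    def node_for(self, key: str) -> str:
        # First ring position clockwise of the key's hash, wrapping around.
        idx = bisect.bisect(self._keys, _h(key)) % len(self._ring)
        return self._ring[idx][1]

ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
print(ring.node_for("user:42"))
```

Adding a fourth node to this ring would move roughly a quarter of the keys, while the rest stay put.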
🔥 What Sets You Apart (Top 1%)
- Built AI-powered data platforms or infra systems in production
- Designed or contributed to:
- Query engines / optimizers
- Data observability / lineage systems
- AI-driven infra or AIOps platforms
- Experience with:
- Multi-modal AI (logs, metrics, traces, text)
- Agentic AI systems
- Autonomous infrastructure
- Worked on systems at scale comparable to:
- Google (BigQuery-like systems)
- Meta (real-time analytics infra)
- Snowflake / Databricks (lakehouse architectures)
🧬 Ideal Background (Not Mandatory)
We often see strong candidates from:
- Data infrastructure or platform engineering teams
- AI-first startups or research-driven environments
- High-scale product companies
Experience building:
- Internal platforms used by 1000s of engineers
- Systems serving millions of users / high throughput workloads
- Multi-region, distributed cloud systems
🧠 The Kind of Problems You’ll Solve
- Can LLMs replace traditional query optimizers?
- How do we build self-healing data pipelines at scale?
- Can data systems learn from every query and improve automatically?
- How do we embed reasoning and planning into infrastructure layers?
- What does a fully autonomous data platform look like?
Backgrounds We Commonly See (But Not Limited To)
Our team often includes engineers from top-tier institutions and strong research or product backgrounds, including:
- Leading engineering schools in India and globally
- Engineers with experience in top product companies, AI startups, or research-driven environments
That said, we care far more about demonstrated ability, depth, and impact than pedigree alone.
JOB DESCRIPTION
Experience: 3-8 years
Wissen Technology is hiring a Java Developer in Mumbai with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.
Required Skills:
- Experience: 3-8 years
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
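The multithreading and high-performance server-side skills above are, for this role, exercised in Java; purely as an illustration of the underlying producer-consumer pattern, here is a minimal Python sketch (the `java.util.concurrent` `BlockingQueue` idiom is directly analogous).

```python
# Producer-consumer with a bounded queue: the bounded capacity applies
# backpressure, and a sentinel value signals end of work.
import queue
import threading

def producer(q: queue.Queue, items):
    for item in items:
        q.put(item)
    q.put(None)  # sentinel: no more work

def consumer(q: queue.Queue, results: list):
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

q: queue.Queue = queue.Queue(maxsize=4)
results: list = []
t1 = threading.Thread(target=producer, args=(q, range(5)))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```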
We are looking for a trainer to deliver an advanced AI Training for Oracle Services, specifically leveraging the licensed tool Claude.
The scope of the training should cover the following topics:
- EBS Form Designing using AI
- APEX Form Designing using AI
- Report Generation (.rdf, Fusion BIP) using AI
- Report Bursting using AI
- PaaS/VBCS Form Personalization using AI tools
- Fusion Form Customization using Groovy Script and AI
- EBS to Fusion Form Conversion
- APEX to Fusion Form Conversion
The expected number of participants is 15, and the training should preferably be conducted offline.
- Troubleshooting and diagnosis, including on-site troubleshooting
- Install and configure new POS hardware systems in stores
- Cabling and networking
- Strong knowledge of computer hardware (PC architecture, components, cabling)
- Understanding of basic networking concepts (TCP/IP, DHCP, ping, IP configuration)
CommVault Consulting Services helps customers overcome the inherent challenges of independently designing, planning, and building out modern data and information management environments.
We have an outstanding career opportunity for a successful Implementation Specialist to join our Professional Services team. This team member will report directly to the Area Services Manager and will be responsible for delivering solution deployments to our customers throughout the US and Canada. The ideal candidate will bring a positive attitude, efficient time management, innovative ideas, a strong work ethic, and quality customer service to our clients.
Job Description
How You Will Make an Impact
- Interface directly with clients to review and discuss deployments, expanding the scale of these projects
- Complete the scope of work as defined by client and sales team
- Validate all Commvault-completed tasks with the customer to ensure proper final configuration of the Commvault solution
- Ensure customer satisfaction during implementation
- Assist team members, as needed
What You Need to Be Ready
- 7+ years of data protection experience
- 2+ years of consulting experience
- Experience with disk/tape storage hardware (HDS, Dell/EMC, NetApp, Oracle, Quantum, etc.)
- Cloud storage experience (Azure, AWS, Oracle)
- Proficiency with backup and recovery of Microsoft SQL, Exchange, and SharePoint
- Technical skills in Oracle, SAP, or other database platforms
- Previous experience with backup software
- CommVault certified
- Bachelor’s degree
Job Description -
We are looking for a Quality Assurance Engineer with a Bachelor's degree in engineering, computer science, computer engineering, information technology, or a similar field. Candidates should have experience testing mobile applications on the Android and iOS platforms. As a Mobility QA Engineer, you will be responsible for creating and implementing a quality coordination and testing strategy, as well as suggesting solutions to quality problems identified in mobile application products.
Designation/Role Name: Software Test Engineer (Mobility)
Experience range: 1 to 5 years of experience
Responsibilities -
· Test current products and identify deficiencies
· Suggest solutions to identified product problems
· Investigate product quality in order to make improvements to achieve better customer satisfaction
· Plan, create and manage the overall Quality Planning strategy
· Collaborate with the Product Development team to ensure consistent project execution
· Identify quality assurance process bottlenecks and suggest actions for improvement
· Oversee continuous improvement projects
· Collect quality data
· Identify KPIs for product quality
· Prepare and present reports and metrics to senior management
· Handle multiple tasks and projects in parallel with tight deadlines
Skills/Expertise :
· Strong in software testing and QA concepts and test case writing, with a break-the-system approach.
· Knowledge of bug tracking and project management tools like JIRA.
· Working knowledge of mobile operating systems such as iOS and Android.
· Ability to test on different platforms and devices, including cloud device tools like BrowserStack.
· Experience developing automated test scripts for mobile app testing using tools like Appium and Selenium.
· Ability to develop test frameworks for mobile application testing using open-source tools and technologies.
· Scripting languages such as Java, Python, or JavaScript.
· Experience using configuration management or version control tools like Git.
· Hands-on experience with Android Studio and Xcode Instruments for debugging Android and iOS application error logs and monitoring memory leaks.
· Knowledge of identifying API requests from mobile clients using the Charles Proxy tool.
· Hands-on experience with IDEs like Eclipse, Android Studio, and IntelliJ IDEA.
· Knowledge of MySQL and database concepts for database testing.
Good to have :
· Experience with CI/CD tools like Jenkins is an added advantage.
· Knowledge of the Codecept.js and Puppeteer test frameworks.
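As an illustration of the break-the-system test case writing mentioned above, here is a minimal boundary-value suite for a hypothetical password rule; both the validator and its rule are invented for the example, and in practice such cases would be scripted against the app via Appium.

```python
# Boundary-value and invalid-input test design: probe just below, on, and
# just above each limit, plus degenerate inputs. The rule is hypothetical:
# 8-20 characters with at least one digit.

def is_valid_password(pw: str) -> bool:
    return 8 <= len(pw) <= 20 and any(c.isdigit() for c in pw)

cases = [
    ("abcdef1", False),        # 7 chars: just below the lower boundary
    ("abcdefg1", True),        # 8 chars: exactly on the lower boundary
    ("a" * 19 + "1", True),    # 20 chars: exactly on the upper boundary
    ("a" * 20 + "1", False),   # 21 chars: just above the upper boundary
    ("abcdefgh", False),       # length ok, but no digit
    ("", False),               # empty input
]
for pw, expected in cases:
    assert is_valid_password(pw) == expected, pw
print("all boundary cases pass")
```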
We have ideated and built innovative web2 variants of games like Ludo, Snakes and Ladders,
Carrom, and more. We are now stepping into the web3 space, having figured out
exciting ways to make our games even more engaging by combining the two
worlds. We are looking to expand our team with energetic, self-motivated
people who are enthusiastic about web3 and willing to learn, experiment, and do what it
takes to make things happen. We are scouting for Blockchain Developers with
hands-on experience in Ethereum and sidechains like Polygon. You will be
responsible for designing, implementing, and supporting our blockchain-based
Dapps.
Roles and Responsibilities:
● Analyse requirements, design and implement smart contracts around a given business model, and build and launch new tokens
● Lead the technical design and implementation of the web3 product roadmap
Must have:
● 1 to 3 years of experience developing on Ethereum and Ethereum Layer-2 networks, and deploying smart contracts on sidechains like Polygon
● Experience writing gas-optimized Solidity smart contracts implementing the provided business logic
● Experience with Ethereum development frameworks such as Truffle and Hardhat, and the Ganache local blockchain
● Experience with token standards such as ERC20, ERC721, ERC1155, and ERC777
● Good understanding of blockchain platforms and consensus protocols
● Good experience with Node.js, MongoDB, MySQL, Kafka, Redis, and Git
● Strong software development background
● Experience working with large codebases
Good to have:
● Conceptual understanding of ICOs, IEOs, IDOs, and STOs
● Understanding of DeFi tokens, launchpads, and automated market makers (AMMs) such as Uniswap, SushiSwap, and PancakeSwap
● Good understanding of liquidity provisioning, token staking, and market analysis
● Experience with Docker and Kubernetes
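To illustrate the ERC20 semantics the token standards above define, here is a minimal Python model of the transfer/approve/transferFrom accounting; a real implementation would be a Solidity contract, and this sketch omits events, zero-address checks, and other edge cases.

```python
# Python model of core ERC20 bookkeeping (not a contract): balances plus
# a (owner, spender) allowance map consumed by transfer_from.

class ERC20Model:
    def __init__(self, supply: int, owner: str):
        self.balances = {owner: supply}
        self.allowances = {}  # (owner, spender) -> remaining amount

    def transfer(self, sender: str, to: str, amount: int):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def approve(self, owner: str, spender: str, amount: int):
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender: str, owner: str, to: str, amount: int):
        if self.allowances.get((owner, spender), 0) < amount:
            raise ValueError("insufficient allowance")
        self.allowances[(owner, spender)] -= amount
        self.transfer(owner, to, amount)

token = ERC20Model(supply=1_000, owner="alice")
token.approve("alice", "bob", 300)
token.transfer_from("bob", "alice", "carol", 200)
print(token.balances)  # {'alice': 800, 'carol': 200}
```

Note that the allowance is decremented before the balance moves, mirroring the checks-effects ordering good Solidity contracts follow.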