
Job Role: Trade Operations Associate
Job Location: Gurugram (Onsite)
Job Type: Full-Time
Relevant Experience:
- Minimum 6 months experience in trading operations / OMS / EMS support.
- Exposure to trading platforms, exchange connectivity, order management systems.
- Experience handling connectivity issues, order failures, or trading incidents.
- Familiarity with UAT testing, system upgrades, and API/FIX connectivity.
Key Skills:
- Trading systems & market infrastructure knowledge
- OMS/EMS & FIX protocol
- Incident management
- System monitoring & reporting
- Risk controls & SOP adherence

About Growing Stars Consulting Private Limited
MUST-HAVES:
- Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
- Notice period - 0 to 15 days only
- Hybrid work mode: 3 days in office, 2 days at home
SKILLS: AWS, AWS CLOUD, AMAZON REDSHIFT, EKS
ADDITIONAL GUIDELINES:
- Interview process: 2 technical rounds + 1 client round
CORE RESPONSIBILITIES:
- The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
- Model Development: Develop algorithms and architectures ranging from traditional statistical methods to deep learning, including the use of LLMs in modern frameworks.
- Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
- Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
- System Integration: Integrate models into existing systems and workflows.
- Model Deployment: Deploy models to production environments and monitor performance.
- Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
- Continuous Improvement: Identify areas for improvement in model performance and systems.
SKILLS:
- Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
- Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting. Other tech touchpoints include ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.
- Model Deployment and Monitoring: MLOps experience deploying ML models to production environments; knowledge of model monitoring and performance evaluation.
REQUIRED EXPERIENCE:
- Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements
- AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows
- AWS Data Services: Redshift, Glue
- Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
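The SQL data-manipulation skill called out above can be illustrated with a small, self-contained sketch. All table and column names below are hypothetical, and Python's built-in sqlite3 stands in for Redshift purely for illustration; the pattern shown (de-duplicate raw data, then aggregate it into a feature) is the kind of transformation the role describes.

```python
import sqlite3

# Hypothetical example: clean duplicate rows and derive a per-user
# feature using plain SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id TEXT, event_ts TEXT, amount REAL);
    INSERT INTO raw_events VALUES
        ('u1', '2024-01-01', 10.0),
        ('u1', '2024-01-01', 10.0),   -- duplicate row to be cleaned
        ('u1', '2024-01-02', 5.0),
        ('u2', '2024-01-01', 7.5);
""")
# Cleaning: keep distinct rows; feature engineering: total spend per user.
rows = conn.execute("""
    WITH deduped AS (SELECT DISTINCT user_id, event_ts, amount FROM raw_events)
    SELECT user_id, SUM(amount) AS total_spend
    FROM deduped
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # [('u1', 15.0), ('u2', 7.5)]
```

In production the same query shape would typically run against Redshift via Glue jobs rather than an in-memory database.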
Job Title : Azure DevOps Engineer
Experience Required : 7+ Years
Work Mode : Remote / Hybrid
Location : Remote
Notice Period : Immediate Joiners / Serving Candidates (within 20 days only)
Interview Mode : Face-to-Face or Virtual
Open Positions : 2
Job Description :
We are seeking an experienced Azure DevOps Engineer with 7+ years of relevant experience in DevOps practices, especially around Azure infrastructure, deployment automation, and CI/CD pipeline management. The ideal candidate should have hands-on expertise with Azure DevOps, GitHub, YAML, and Azure services, along with solid communication and coordination capabilities.
Mandatory Skills : Azure DevOps, GitHub Actions, YAML, Bicep, Azure services (App Gateway, WAF, NSG, CosmosDB, Storage Accounts), Unix scripting, and Azure Fundamentals certification.
Key Responsibilities :
- Manage deployments for Dynamics 365 and proxy applications
- Run and maintain ADO pipelines and GitHub Actions
- Ensure proper status updates on ADO Boards and deployed work items
- Coordinate with QA teams to execute smoke testing post-deployment
- Communicate deployment progress across team channels effectively
- Monitor deployment cycles, approval gates, logs, and alerts
- Ensure smooth integration of infrastructure and DevOps practices
Mandatory Skills :
- Minimum 7+ years in DevOps, with strong experience in Azure DevOps (ADO).
- Proven expertise in building pipelines using Azure DevOps and GitHub.
- Proficiency in Bicep, YAML scripting, and Azure Infrastructure-as-Code (IaC).
- Hands-on with Azure services like :
- App Gateway, WAF, NSG, CosmosDB, Storage Accounts.
- vNet, Managed Identity, KeyVault, AppConfig, App Insights.
- Basic Azure Fundamentals Certification (AZ-900).
- Excellent communication skills in English.
Nice to Have :
- Experience in managing large enterprise-scale deployments.
- Familiarity with branching strategies and monitoring tools.
- Exposure to Approval Gates and Deployment Governance.
Company: RE/MAX Realty – Real Estate Company
We are Hiring – Real Estate Broker Agent
Project Focus: Farmhouses | Villas | Plots | Commercial Spaces
· Join Remax Realty – Earn Big in Real Estate with Us!
· Looking for Growth + Unlimited Income? Become our Real Estate Agent today!
· Turn Opportunities into Success – Join Our Real Estate Team Now!
· Flexible Hours, High Earnings – Build Your Career with Remax Realty!
If you have a passion for real estate, strong networking skills, and the drive to close deals, we want you on our team!
What We Offer:
High-commission structure
Premium projects in top locations
Growth & long-term association
Who Can Apply?
Freelancers, especially those working with us
Self-driven individuals with good communication skills
Prior real estate experience preferred (but not mandatory)
Join us and grow with one of the fastest-growing real estate firms.
🔍 Job Description:
We are looking for an experienced and highly skilled Technical Lead to guide the development and enhancement of a large-scale Data Observability solution built on AWS. This platform is pivotal in delivering monitoring, reporting, and actionable insights across the client's data landscape.
The Technical Lead will drive end-to-end feature delivery, mentor junior engineers, and uphold engineering best practices. The position reports to the Programme Technical Lead / Architect and involves close collaboration to align on platform vision, technical priorities, and success KPIs.
🎯 Key Responsibilities:
- Lead the design, development, and delivery of features for the data observability solution.
- Mentor and guide junior engineers, promoting technical growth and engineering excellence.
- Collaborate with the architect to align on platform roadmap, vision, and success metrics.
- Ensure high quality, scalability, and performance in data engineering solutions.
- Contribute to code reviews, architecture discussions, and operational readiness.
🔧 Primary Must-Have Skills (Non-Negotiable):
- 5+ years in Data Engineering or Software Engineering roles.
- 3+ years in a technical team or squad leadership capacity.
- Deep expertise in AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, S3.
- Advanced programming experience with PySpark, Python, and SQL.
- Proven experience in building scalable, production-grade data pipelines on cloud platforms.
Looking For Java Backend Developer
Greetings from Skilltasy!
Urgent Requirement…
Job Role – Java Backend Developer
Experience – 6+ years
Work Location – Bangalore / Hyderabad
Notice Period – Immediate to 30 days
Responsibilities -
· Develop enterprise-level applications and features using above mentioned technologies and tools.
· Collaborate with other developers, designers, and stakeholders to deliver highly functional features.
· Deliver on development requests while also setting accurate expectations, adhering to best practices, and creating modular and testable code.
· Adhere to an Agile methodology.
· Consider the user journey to create an intuitive user experience.
Please share profiles to
Unlock endless job opportunities today - Sign up at www.skilltasy.com/signup.
Achieving branch targets (end-to-end sales).
* Prospect generation through cold calling.
* Meeting & counseling prospective students (mostly working executives in various industries).
* Primary and secondary research.
* Verifying trainers' and business identity on a regional basis.
* Analysing software for trainers.
* Connecting with trainers and learners over phone or email.
Job Responsibilities:
Generating revenues by achieving individual targets.
Presenting the company's offer to potential clients.
Handling objections.
Converting suspects into prospects and closing deals.
Daily monitoring of individual as well as team activity.
Maintaining daily reports.
Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in batch and live-stream formats, transformed large volumes of data daily, built a data warehouse to store the transformed data, and integrated different visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- You would be required to code in Scala and PySpark daily, on cloud as well as on-prem infrastructure
- Build data models to store the data in the most optimized manner
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implementing the ETL process and optimal data pipeline architecture
- Monitoring performance and advising any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions
- Must be able to write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deploying to production, checking for optimization issues and code standards
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience in working with batch-processing/real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, and Apache Airflow.
- Experience implementing complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering
- Expert at Python /Scala programming, especially for data engineering/ ETL purposes
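The "Creation of DAGs" requirement above usually means an orchestrator such as Apache Airflow; as a dependency-free illustration, Python's standard-library graphlib (Python 3.9+) can sketch the same core idea: declaring tasks with their dependencies and deriving a valid execution order. All task names here are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each key depends on the tasks in its value set,
# analogous to upstream tasks in an Airflow DAG.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
    "data_quality_report": {"validate", "load_warehouse"},
}

# static_order() yields an order in which every task appears after all of
# its dependencies; it raises CycleError if the graph is not acyclic.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and monitoring on top of this ordering, but the dependency declaration is the same shape.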
Quality Assurance Engineer
Notice Period : 45 days / Immediate Joining
We are looking for a Quality Assurance (QA) engineer to develop and execute exploratory and automated tests to ensure product quality. QA engineer responsibilities include designing and implementing tests, debugging, and defining corrective actions. You will also review system requirements and track quality assurance metrics.
The ideal candidate will be instrumental in shaping the product direction and will be actively involved in defining key product features that impact the business. You will work with Principal Engineers to evolve the design and architecture of the products owned by this team. You will be responsible to set up and hold a high software quality bar besides providing technical direction to a highly technical team of Software Engineers. Ultimately, you should monitor all stages of software development to identify and resolve system malfunctions to meet quality standards.
Responsibilities
· Review requirements, specifications, and technical design documents to provide timely and meaningful feedback.
· Create detailed, comprehensive, and well-structured test plans and test cases.
· Estimate, prioritize, plan and coordinate testing activities.
· Design, develop and execute automation scripts using open source tools.
· Perform thorough regression testing when bugs are resolved.
· Develop and apply testing processes for new and existing products to meet client needs.
· Monitor debugging process results.
· Investigate the causes of non-conforming software and train users to implement solutions.
· Track quality assurance metrics, like defect densities and open defect counts.
· Stay up-to-date with new testing tools and test strategies.
Requirements
· 3+ years of experience as part of a QA Support and Services team.
· Experience building test plans and test cases, solving problems to improve quality and speed of delivery, and knowledge of QA methodology and tools.
· Experience in coding/scripting and user-level automation.
· Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
· Hands-on experience with web application frontend testing, API testing, and mobile application testing.
· Experience working with frameworks like Robot, Appium, Selenium, and Cypress.
· Strong knowledge of software QA methodologies, Agile/Scrum development processes, and tools such as Git/Bitbucket, JIRA, and Confluence.
· Experience in writing clear, concise, and comprehensive test plans and test cases.
· Experience with performance and/or security testing is a plus.
· Good knowledge of Docker and container applications, with production experience.
· Critical thinking and problem-solving skills; a team player.
· Good time-management skills; an ambitious individual who can work under their own direction towards agreed targets/goals.
· Must be flexible with office timings to accommodate multi-national client timings.
· Involvement in development operations and support for internal teams.
www.banyandata.com
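The "well-structured test cases" expectation above can be sketched with Python's built-in unittest. The function under test and every name here are hypothetical; the point is the structure: one behavior per test method, with descriptive names, which is the same discipline expected in the Selenium/Appium automation the posting mentions.

```python
import unittest

def normalize_username(raw: str) -> str:
    """Hypothetical function under test: trims and lower-cases a username."""
    return raw.strip().lower()

class NormalizeUsernameTests(unittest.TestCase):
    """A small, well-structured test case: one behavior per test method."""

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(normalize_username("  Alice  "), "alice")

    def test_lowercases_mixed_case(self):
        self.assertEqual(normalize_username("BoB"), "bob")

    def test_empty_input_stays_empty(self):
        self.assertEqual(normalize_username(""), "")

# Run the suite explicitly so the sketch works in any context.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeUsernameTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("passed" if result.wasSuccessful() else "failed")
```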
- Conduct market research to identify selling possibilities and evaluate customer needs
- Actively seek out new sales opportunities
- Set up meetings with potential clients and listen to their concerns
- Create frequent reviews and reports with sales and financial data
- Ensure the availability of stock for sales and demonstrations
- Participate on behalf of the company in exhibitions or conferences
- Negotiate/close deals and handle complaints or objections
- Collaborate with team members to achieve better results
- Gather feedback from customers or prospects and share it with internal teams
Requirements
- Proven experience as a Sales Executive or relevant role
- Proficiency in English
- Excellent knowledge of MS Office
- Thorough understanding of marketing and negotiating techniques
- Fast learner and passion for sales
- Self-motivated with a results-driven approach
- Aptitude in delivering attractive presentations
- Graduate from any stream with relevant experience is preferred











