UI/UX Developer
- Come up with UI and UX strategies based on our target goals.
- Create and maintain digital assets, such as interface design files, wireframes, and interactive mockups, using design and prototyping tools (e.g., Sketch and InVision).
- Design, build and maintain highly reusable JavaScript, HTML, and CSS code.
Experience: 0-5 years
Salary: 7k to 50k/month
Location: Bhopal
Work from office

About NewRise Technosys Pvt Ltd
- 4+ years of professional experience in Cloud FinOps, IT Financial Management, or Cloud Cost Governance within an IT organization.
- 6-8 years of overall experience in Cloud Infrastructure Management, DevOps, Software Development, or related technical roles with hands-on cloud platform expertise.
- Hands-on experience with native cloud cost management tools (e.g., AWS Cost Explorer, Azure Cost Management, OCI Cost Analysis) and/or third-party FinOps platforms (e.g., Cloudability, CloudHealth, Apptio).
- Proven experience working within the FinOps domain in a large enterprise environment.
- Strong background in building and managing custom reports, dashboards, and financial insights.
- Deep understanding of cloud financial management practices, including chargeback/showback models, cost savings and avoidance tracking, variance analysis, and financial forecasting.
- Solid knowledge of cloud provider pricing models, billing structures, and optimization strategies.
- Practical experience with cloud optimization and governance practices such as anomaly detection, capacity planning, rightsizing, tagging strategies, and storage lifecycle policies.
- Skilled in leveraging automation to drive operational efficiency in cloud cost management processes.
- Strong analytical and data storytelling skills, with the ability to collect, interpret, and present complex financial and technical data to diverse audiences.
- Experience developing KPIs, scorecards, and metrics aligned with business goals and industry benchmarks.
- Ability to influence and drive change management initiatives that increase adoption and maturity of FinOps practices.
- Highly results-driven, detail-oriented, and goal-focused, with a passion for continuous improvement.
- Strong communicator and collaborative team player with a passion for mentoring and educating others.
- Strong proficiency in Python and SQL for data analysis, automation, and tool development, with demonstrated experience building production-grade scripts and applications.
- Hands-on development experience building automation solutions, APIs, or internal tools for cloud management or financial operations.
- Practical experience with GenAI technologies, including prompt engineering and integrating LLM APIs (OpenAI, Claude, Bedrock) into business workflows.
- Experience with Infrastructure as Code (e.g., Terraform) and CI/CD pipelines for deploying FinOps automation and tooling.
- Familiarity with data visualization tools (e.g., Power BI) and building interactive dashboards programmatically.
- Knowledge of ML/AI frameworks is a plus.
- Experience building chatbots or conversational AI interfaces for internal tooling is a plus.
- FinOps Certified Practitioner.
- AWS, Azure, or OCI cloud certifications are preferred.
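The chargeback/showback and variance-analysis duties above can be sketched in a few lines of Python. This is a minimal illustration only, assuming cost records arrive as plain dicts with a `team` tag and a `cost` field (all names here are hypothetical; real data would come from AWS Cost Explorer, Azure Cost Management, or a FinOps platform export):

```python
from collections import defaultdict

def showback(records):
    """Aggregate spend per team tag; untagged spend lands in 'unallocated'."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec.get("team") or "unallocated"] += rec["cost"]
    return dict(totals)

def variance(actual, forecast):
    """Return absolute and percentage variance of actual vs. forecast spend."""
    delta = actual - forecast
    pct = (delta / forecast * 100) if forecast else float("inf")
    return delta, pct
```

In practice the `records` list would be fed from the provider's billing export, and the `showback` totals would drive the dashboards and scorecards the role describes.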
About Astra:
Astra is a cyber security SaaS company that makes otherwise chaotic pentests a breeze with its one-of-a-kind Pentest Platform. Astra's continuous vulnerability scanner emulates hacker behavior to scan applications for 9300+ security tests. CTOs & CISOs love Astra because it helps them fix vulnerabilities in record time and move from DevOps to DevSecOps with Astra's CI/CD integrations.
Astra is loved by 650+ companies across the globe. In 2023 Astra uncovered 2 million+ vulnerabilities for its customers, saving customers $69M+ in potential losses due to security vulnerabilities.
We've been awarded by then-President of France François Hollande at the La French Tech program and by Prime Minister of India Shri Narendra Modi at the Global Conference on Cyber Security. Loom, MamaEarth, Muthoot Finance, Canara Robeco, and ScripBox are a few of Astra's customers.
Role Overview:
As a back-end engineer you will be responsible for developing and maintaining the platform/dashboard backend. This would involve developing & maintaining RESTful services for vulnerability management, scan orchestration, inventory management and platform features such as on-boarding, trust centers, certificates, payment integrations, vulnerability ingestion, authentication etc.
You should have a strong background in backend programming (Symfony preferred, or Laravel) and experience with an event-driven, async & distributed architecture.
During the first 6 months of your role, you will be involved in the 0->1 journey of two of our upcoming products in our platform along with the respective product owners.
Love solving hard problems? Want to build high impact products rooted in first principles? Is coding your poetry? Join us in shaping the future of cyber security.
Roles & Responsibilities
- Design, develop, and maintain backend services and APIs using Symfony PHP framework. What you create is also what you own.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Collaborate with scanner teams to orchestrate scans, ingest vulnerabilities, configure scanners etc.
- Write clean, well-documented, and efficient code.
- Optimize and refactor existing code to improve performance and reliability.
- Implement security and data protection measures.
- Triage, troubleshoot and upgrade existing systems.
- Ship code to production multiple times a day/week.
- Ensure timely delivery of the features.
- Test your own features, and write test cases for continuous automated testing.
- Participate in code reviews and contribute to best practices and standards.
What we are looking for:
- 3 years of experience in a similar role, or equivalent working experience.
- Strong background in PHP backend programming. Symfony preferred, or Laravel.
- Strong understanding of software architecture principles and design patterns.
- Experience with an event-driven (subscribers/listeners), async (MessageHandlers & job queues) & distributed (load-balanced) architecture.
- Experience with PostgreSQL database and data modeling.
- Understanding of concepts such as Dependency Injection, ORM, data validation, error handling etc.
- Experience with Docker, Kubernetes, GitHub Actions (good to have).
- Experience with Agile methodologies.
- Excellent problem-solving skills and the ability to think strategically about technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional remote teams.
- Demonstrated track record of delivering high-quality software products on schedule.
- Knowledge of industry best practices in software development, security, and compliance.
- Knowledge of application & infrastructure security - helps you stand out.
What We offer:
- Adrenalin rush of being a part of a fast-growing company, and working on hard problems that matter.
- Fully remote, agile working environment.
- Good engineering culture with full ownership in design, development, release lifecycle.
- A wholesome opportunity where you get to build things from scratch, improve and ship code to production in hours, not weeks.
- Holistic understanding of SaaS and enterprise security business.
- Experience with the security side of things.
- Annual trips to beaches or mountains (next one is to Wayanad).
- Open and supportive culture.
- Health insurance & other benefits for you and your spouse. Maternity benefits included.
About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
- Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
- Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
- Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
- Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
- Client Success: Ensure a smooth onboarding experience and transition for every new learner.
- Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
- Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
- High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
- Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
- Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
- Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
- Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
- Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
- Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
- Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
- Goal-Oriented: You’re self-driven, proactive, and hungry for results.
- Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
- 💼 High-growth sales career with serious earning potential
- 🌱 Continuous upskilling in EdTech, sales, and communication
- 🧘 Supportive culture that values growth and well-being
- 🎯 Opportunity to work at the cutting edge of education innovation
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with maintaining Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different ingestion, aggregation and load events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters in Amazon EC2 & S3, including automation of setting up and extending clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge of installation, configuration, support and management of Hadoop clusters using the Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala; also used Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
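The MapReduce paradigm referenced throughout the profile above can be illustrated without a cluster. Here is a minimal Python sketch of the map/shuffle/reduce flow behind the classic Hadoop word count (stand-in functions only; a real job would run mappers and reducers over HDFS input splits):

```python
from collections import Counter
from functools import reduce

def map_phase(line):
    # Mapper: emit (word, 1) pairs for one input line, folded into a Counter.
    return Counter(line.lower().split())

def reduce_phase(left, right):
    # Reducer/combiner: merge partial counts for the same keys.
    return left + right

def word_count(lines):
    # The framework's shuffle step is implicit in Counter addition here.
    return dict(reduce(reduce_phase, map(map_phase, lines), Counter()))
```

The same shape is what Spark's RDD `map`/`reduceByKey` transformations express, which is why Hive/SQL queries can be rewritten as chains of such operations.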
Job Title: Senior Product Delivery Manager
Job Location: Mumbai
About Qrata
We match top talent with highly competitive career opportunities at leading startups in India and globally, and address hiring challenges and recruitment brand marketing through innovative, collaborative strategic consulting.
About the company we are hiring for
It is a global technology firm that offers a SaaS based Delivery Automation Platform. The software helps brands across Food & Beverage, Courier, Express and Parcel, eCommerce & Retail and Transportation (3PLs, 4PLs, etc.) to digitize, optimize and automate deliveries across the supply chain.
Growing at an average rate of 120% YoY, it has 200+ enterprise clients in 50+ countries, with headquarters in New Jersey, USA and regional offices in Mumbai, Jakarta, Delhi, and Dubai. They are backed by $49.5 million across three rounds of private equity investment from Tiger Global Management, Steadview Capital and Alibaba Group companies.
They are incredibly excited about the SaaS ecosystem as well and proud to be one of the frontrunners when it comes to building a SaaS platform for the logistics industry. With some of the biggest brands like KFC, Starbucks, McDonald’s, Singapore Post and Decathlon as their clients, their platform has matured tremendously and powers millions of shipments across the globe every day.
Their Culture
Most of their workforce is in Mumbai, with a bunch of people interested in technology and working at the forefront of innovation in the logistics automation industry. With smaller teams distributed across the globe, the entire team gets together during their annual workcation. The vision of building a global enterprise company and going IPO is what drives them to achieve more, every day!
They endeavour to change the world through continuous innovation and optimization in the space of logistics and supply chain management. Join them as they create the next version of an impactful product which aims to revolutionize business processes and user behaviour through unprecedented forms of disruption.
We are currently scouting for a Senior Product Delivery Manager.
As they experience exponential growth, they are looking for individuals with experience implementing SaaS solutions for large, global enterprises. The ideal candidate will be an innovative self-starter comfortable working in a start-up environment and adept at managing client expectations. You will be the face of the company to C-level executives and senior leaders of our enterprise clients.
You will develop the strategy behind implementations, and bring an obsession for exceeding customer expectations by delivering in accordance with project success metrics. You will also build long-term relationships with the clients, connecting with key business executives and stakeholders. You will liaise between clients, cross-functional internal teams and other system integrators to ensure the timely and successful delivery of our solutions according to customer needs.
Responsibilities
- Work with Business Development, Product Management and Engineering team to build highly scalable SaaS products and integration modules
- Connect with clients to determine their technical and functional requirements and recommend appropriate solutions to ensure successful product implementation
- Prepare detailed project plans that blend client requirements well with the company’s vision, and coordinate across various teams to establish smooth execution during all phases of the project
- Adhere to the standards of scope, timeline, and budget throughout the entire execution of project lifecycle and ensure success of the project
- Work closely with development team to explain the requirements and constantly monitor the progress
- Be hands-on, adopt practical approach to software and technology
- Suggest improvements in systems & processes and assist technical team with issues needing technical expertise
- Create technical content such as blogs, technical specification documents and system integration requirements documents
Requirements
- Bachelor’s Degree in Computer Science, Information Technology, or related field
- 8 to 10 years of experience in project and delivery management for Enterprise SaaS companies
- Solid technical background with a minimum of 2 years of hands-on experience in software development, preferably in cloud technologies
- Proven ability to drive large-scale global projects with a deep understanding of agile SDLC, high collaboration and leadership
- Strong working knowledge of Microsoft Office
- Excellent written and oral communication skills, judgement, and decision-making skills, and the ability to work under continual deadline pressure
- Willingness to travel occasionally within India and internationally
Must Haves:
- Bachelor's Degree in Computer Science, Information Technology, or related field
- Experience in project and delivery management for Enterprise SaaS companies
- Solid technical background with a minimum of 2 years of hands-on experience in software development, preferably in cloud technologies
- Proven ability to drive large-scale global projects with a deep understanding of agile SDLC, high collaboration and leadership
- Strong working knowledge of Microsoft Office
- Excellent written and oral communication skills, judgement, and decision-making skills, and the ability to work under continual deadline pressure
- Willingness to travel occasionally within India and internationally
As AWS Data Engineer, you are a full-stack data engineer that loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision-making. You are passionate about the data quality of our business metrics and the flexibility of your solution that scales to respond to broader business questions.
If you love to solve problems using your skills, then come join the Team Mactores. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.
What you will do?
- Write efficient code in PySpark and AWS Glue
- Write SQL queries in Amazon Athena and Amazon Redshift
- Explore new technologies and learn new techniques to solve business problems creatively
- Collaborate with many teams - engineering and business, to build better data products and services
- Deliver the projects along with the team collaboratively and manage updates to customers on time
What are we looking for?
- 1 to 3 years of experience with Apache Spark, PySpark and AWS Glue
- 2+ years of experience writing ETL jobs using PySpark and Spark SQL
- 2+ years of experience with SQL queries and stored procedures
- A deep understanding of the DataFrame API and the transformation functions supported by Spark 2.7+
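The ETL duties above follow an extract→transform→load shape. The following is a minimal pure-Python sketch of that pattern for orientation only; in the actual role these steps would be PySpark DataFrame operations running under AWS Glue, and all record/field names here are hypothetical:

```python
def extract(rows):
    # Stand-in source: yields raw records (in Glue this would be a DynamicFrame).
    return iter(rows)

def transform(rows):
    # Drop records with missing amounts and normalize types --
    # the kind of logic Spark SQL expresses declaratively.
    for row in rows:
        if row.get("amount") is None:
            continue
        yield {"order_id": row["order_id"], "amount": float(row["amount"])}

def load(rows):
    # Stand-in sink: collect to a list (in production, write Parquet to S3).
    return list(rows)

def run_job(raw):
    return load(transform(extract(raw)))
```

Keeping the three stages as separate functions, as above, is what makes such jobs easy to unit-test before they ever touch a cluster.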
You will be preferred if you have
- Prior experience in working on AWS EMR, Apache Airflow
- Certifications: AWS Certified Big Data – Specialty, Cloudera Certified Big Data Engineer, or Hortonworks Certified Big Data Engineer
- Understanding of DataOps Engineering
Life at Mactores
We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.
1. Be one step ahead
2. Deliver the best
3. Be bold
4. Pay attention to the detail
5. Enjoy the challenge
6. Be curious and take action
7. Take leadership
8. Own it
9. Deliver value
10. Be collaborative
We would like you to read more details about the work culture on https://mactores.com/careers
The Path to Joining the Mactores Team
At Mactores, our recruitment process is structured around three distinct stages:
Pre-Employment Assessment: You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.
Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.
HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.
At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.
- 3+ years of solid frontend development experience, with a minimum of 2 years of work on ReactJS / React Native
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Strong, thorough understanding of ReactJS and its core principles
- Experience with popular ReactJS workflows such as Redux
- Knowledge of isomorphic React is a plus
- Familiarity with RESTful APIs
- Familiarity with modern front-end build pipelines and tools
- High coding standards: understanding of test coverage best practices & the test pyramid concept
- Familiarity with the Continuous Delivery approach
- Experience with distributed and concurrent systems, knowing the tradeoffs between stateful/stateless and synchronous/asynchronous architectures
- Passion for technology and its relationship with product and user experience
- Ability to work independently: identify problems, create plans and implement solutions
What is the role?
This position will play a primary role in planning, implementing, leading and executing all aspects of Quality and Testing. The role requires hands-on QA expertise, ability to take initiative and develop and evolve our QA processes. The right candidate will be able to analyze the systems to determine what to test, the priority of those tests and the best way to test.
Key Responsibilities
- Perform impact/test analysis for the new features, feature-enhancements and bug fixes.
- Conduct test design and test execution for functional/non-functional aspects of features / product
- Hands-on execution for functional/non-functional aspects of features / product
- Automate the tests to ensure repeatability of tests
What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply even if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.
- A software quality professional with 3 to 5 years of experience.
- Experience developing and executing test cases, scripts, plans and procedures to support various development methodologies.
- Familiarity with the entire software development life cycle and test cycles (Unit, Regression, Functional, Systems, Performance and Volume, User Acceptance).
- Knowledge of and experience with Java or Python, Selenium, Appium, etc. to automate testing of websites and apps on desktop/handheld devices.
- Knowledge of and experience with tools like JMeter, Gatling and Locust to perform non-functional tests.
- Detailed and effective written communication skills for documenting the features tested and bugs found
- Highly organized, detail oriented, extremely responsive candidate who would like to take up technical challenges
- Self-motivated, works well independently and with others
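Test design of the kind described above often starts with boundary-value analysis. Here is a small Python sketch that generates boundary cases for a numeric input field; the age field and its 18-60 limits are invented purely for illustration:

```python
def boundary_cases(lo, hi):
    """Classic boundary-value cases for an inclusive [lo, hi] input range."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def is_valid_age(age, lo=18, hi=60):
    """Stand-in system under test: accept ages within the inclusive range."""
    return lo <= age <= hi

# Pair each boundary case with the validator's verdict.
results = {age: is_valid_age(age) for age in boundary_cases(18, 60)}
```

In practice the generated cases would be fed as a parametrized suite into Selenium/Appium form-fill steps, keeping the test data derivation separate from the UI automation.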
Whom will you work with?
You will work with a top-notch tech team and report to the engineering head.
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts while maintaining quality of content, interact and share your ideas, and learn a great deal at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.
We are
A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore and Dublin. We have three products in our portfolio: Plum, Empuls and Compass. Xoxoday works with over 1000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners or consumers for better business results.
Way forward
We look forward to connecting with you. As you may take time to review this opportunity, we will wait around 3-5 days before screening the collected applications and lining up job discussions with the hiring manager. We will nevertheless try to maintain a reasonable time window for closing this requirement, and candidates will be kept informed of their application status.