UX Designer
at RMgX

Posted by Supriya Tripathi
3 - 7 yrs
₹5L - ₹6L / yr
Delhi, Gurugram, Noida
Skills
User Experience (UX) Design
Visual Designing
User Interface (UI) Design
Test Description
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Shubham Vishwakarma

Full Stack Developer - Averlon
I had an amazing experience. It was a delight getting interviewed via Cutshort. The entire end to end process was amazing. I would like to mention Reshika, she was just amazing wrt guiding me through the process. Thank you team.
Companies hiring on Cutshort

About RMgX

Founded: 2015
Type: Products & Services
Size: 20-100
Stage: Bootstrapped

About

A new breed of technology-focused innovation firm. A firm that is agile, evolutionary, and data-driven.

Connect with the team

Supriya Tripathi

Company social profiles

Blog | LinkedIn

Similar jobs

AI Industry
Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
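Reflection and query optimization in engines like Dremio lean heavily on file- and partition-level statistics to skip data that cannot match a predicate. As a rough illustration of that idea only (not Dremio's actual implementation — the file paths, column name, and `FileStats` shape are all invented for this sketch), min/max pruning over Parquet-style footer statistics looks like:

```python
from dataclasses import dataclass

@dataclass
class FileStats:
    """Min/max column statistics, like those stored in Parquet footers."""
    path: str
    min_order_date: str
    max_order_date: str

# Hypothetical file listing for an S3-backed table (illustrative names only).
files = [
    FileStats("s3://lake/orders/part-00.parquet", "2023-01-01", "2023-03-31"),
    FileStats("s3://lake/orders/part-01.parquet", "2023-04-01", "2023-06-30"),
    FileStats("s3://lake/orders/part-02.parquet", "2023-07-01", "2023-09-30"),
]

def prune(files, lo, hi):
    """Keep only files whose [min, max] range overlaps the query's date range.

    ISO-8601 date strings compare correctly as plain strings.
    """
    return [f.path for f in files if f.max_order_date >= lo and f.min_order_date <= hi]

# A query like WHERE order_date BETWEEN '2023-05-01' AND '2023-05-31'
# only needs to scan one of the three files.
print(prune(files, "2023-05-01", "2023-05-31"))
```

The same overlap test generalizes to any sortable column, which is why well-chosen partition and sort keys matter so much for query performance tuning.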


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
CLOUDSUFI
Noida
3 - 7 yrs
₹15L - ₹28L / yr
Python
FastAPI
Authentication
Google Cloud Platform (GCP)
ACL

About Us


CLOUDSUFI, a Google Cloud Premier Partner, is a data science and product engineering organization building products and solutions for the technology and enterprise industries. We firmly believe in the power of data to transform businesses and help them make better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.


About Role:


The Senior Python Developer will lead the design and implementation of ACL crawler connectors for Workato’s search platform. This role requires deep expertise in building scalable Python services, integrating with various SaaS APIs and designing robust data models. The developer will mentor junior team members and ensure that the solutions meet the technical and performance requirements outlined in the Statement of Work.


Key Responsibilities:


  • Architecture and design: Translate business requirements into technical designs for ACL crawler connectors. Define data models, API interactions and modular components using the Workato SDK.
  • Implementation: Build Python services to authenticate, enumerate domain entities and extract ACL information from OneDrive, ServiceNow, HubSpot and GitHub. Implement incremental sync, pagination, concurrency and caching.
  • Performance optimisation: Profile code, parallelise API calls and utilise asynchronous programming to meet crawl time SLAs. Implement retry logic and error handling for network‑bound operations.
  • Testing and code quality: Develop unit and integration tests, perform code reviews and enforce best practices (type hints, linting). Produce performance reports and documentation.
  • Mentoring and collaboration: Guide junior developers, collaborate with QA, DevOps and product teams, and participate in design reviews and sprint planning.
  • Hypercare support: Provide Level 2/3 support during the initial rollout, troubleshoot issues, implement minor enhancements and deliver knowledge transfer sessions.
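The incremental sync, pagination, concurrency, and retry concerns listed above can be sketched in plain asyncio. This is a hedged illustration only — `fetch_page`, its cursor scheme, and the injected failure are invented stand-ins for a real SaaS endpoint, not the Workato SDK or any actual connector API:

```python
import asyncio

class TransientError(Exception):
    pass

async def fetch_page(cursor, _fail_once={"done": False}):
    """Stand-in for a paginated SaaS API call (e.g. a listing endpoint).

    Returns (items, next_cursor); a next_cursor of None means the last page.
    The very first call raises once, to exercise the retry path.
    """
    if not _fail_once["done"]:
        _fail_once["done"] = True
        raise TransientError("HTTP 429")
    await asyncio.sleep(0)  # simulate network latency
    pages = {0: (["a", "b"], 1), 1: (["c"], None)}
    return pages[cursor]

async def fetch_with_retry(cursor, retries=3, backoff=0.01):
    """Retry transient failures with exponential backoff."""
    for attempt in range(retries):
        try:
            return await fetch_page(cursor)
        except TransientError:
            if attempt == retries - 1:
                raise
            await asyncio.sleep(backoff * 2 ** attempt)

async def crawl():
    """Follow cursors until the API reports no more pages."""
    items, cursor = [], 0
    while cursor is not None:
        page, cursor = await fetch_with_retry(cursor)
        items.extend(page)
    return items

print(asyncio.run(crawl()))
```

In a real connector, the same loop would also persist the last cursor between runs (the incremental-sync part) and fan out independent entity types with `asyncio.gather` for concurrency.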



Must Have Skills and Experiences:


  • Bachelor’s degree in Computer Science or related field.
  • 3-8 years of Python development experience, including asynchronous programming and API integration.
  • Knowledge of Python libraries such as pandas, pytest, requests, and asyncio
  • Strong understanding of authentication protocols (OAuth 2.0, API keys) and access‑control models.
  • Experience integrating with cloud or SaaS platforms such as Microsoft Graph, the ServiceNow REST API, the HubSpot API, and the GitHub API.
  • Proven ability to lead projects and mentor other engineers.
  • Excellent communication skills and ability to produce clear documentation.



Optional/Good to Have Skills and Experiences:


  • Experience integrating with the Microsoft Graph API, ServiceNow REST API, HubSpot API, or GitHub API.
  • Familiarity with libraries, tools, and technologies such as aiohttp, PyJWT, and aiofiles/aiocache will be advantageous.
  • Experience with containerisation (Docker), CI/CD pipelines, and Workato’s connector SDK is also considered a plus.



A growing online gaming company
Agency job
via Jobdost by Sathish Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 6 yrs
₹5L - ₹13L / yr
React.js
AngularJS (1.x)
Vue.js
JavaScript
HTML/CSS

Role Summary:

A front-end developer who will contribute to building a highly flexible and scalable front end by bringing deep core-technology expertise.

Job Description:

- Ensure proper offline access using service workers and PWA

- Develop components which are SEO friendly

- Have experience in server-side rendering

- Drive evolution of application performance

- Ensure project scalability by having good project architecture

 

Skill Requirements:

- Good experience in HTML5, CSS3, and JS (React)

- React Native

- Angular, with Redux exposure

- Experience handling service workers and PWA caching and updates

- Knowledge of CSS pre-processors such as SASS/LESS and CSS Modules

- Knowledge of bundlers such as webpack and Parcel, and task runners such as Grunt

- Architecting and automating the build process for production, using task runners or scripts (Gulp/Grunt)

- An eye for good UI/UX, Progressive Web Apps, and Responsive Design

- Interest in writing code and actively experimenting while learning new things

 

Individuals applying to this role should ideally have the following attributes:

- Passionate about front-end development, continually following the platform and its innovations

- A strong and innovative approach to problem solving and finding solutions

- Interested in working in a fast-paced environment

- Excellent communicator (written and verbal, formal and informal)

- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution

- Ability to multi-task under pressure and work independently with minimal supervision.

- Ability to prioritize when under pressure.

Pune
3 - 5 yrs
₹20L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
CI/CD

As an engineer, you will help with the implementation and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elastic Search, Lambda, ECS, Redis, Spark, Kafka, etc.) and apply new technologies to solve problems. You will have an influence on defining product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale, highly available systems.

If you are looking for an opportunity to work on leading technologies, would like to build product technology that can cater to millions of customers while providing them the best experience, and relish large ownership and diverse technologies, join our team today!

 

What You'll Do:

  • Creating detailed design, working on development and performing code reviews.
  • Implementing validation and support activities in line with architecture requirements
  • Help the team translate the business requirements into R&D tasks and manage the roadmap of the R&D tasks.
  • Design, build, and implement the product; participate in requirements elicitation, validation of architecture, and creation and review of high- and low-level design; assign and review tasks for product implementation.
  • Work closely with product managers, UX designers, and end users, and integrate software components into a fully functional system
  • Ownership of product/feature end-to-end for all phases from the development to the production.
  • Ensuring the developed features are scalable and highly available with no quality concerns.
  • Work closely with senior engineers to refine the design and implementation.
  • Management and execution against project plans and delivery commitments.
  • Assist directly and indirectly in the continual hiring and development of technical talent.
  • Create and execute appropriate quality plans, project plans, test strategies and processes for development activities in concert with business and project management efforts.

The ideal candidate is an engineer who is passionate about delivering experiences that delight customers and about creating robust solutions. He/she should be able to commit to and own deliveries end to end.

 

 

What You'll Need:

 

  • A Bachelor's degree in Computer Science or related technical discipline.
  • 2-3+ years of software development experience with proficiency in Java or equivalent object-oriented languages, coupled with design and SOA experience
  • Fluency with Java and Spring is desirable.
  • Experience in JEE applications and frameworks such as Struts, Spring, MyBatis, Maven, and Gradle
  • Strong knowledge of Data Structures, Algorithms and CS fundamentals.
  • Experience in at least one shell scripting language; SQL (SQL Server, PostgreSQL); and data modeling skills
  • Excellent analytical and reasoning skills
  • Ability to learn new domains and deliver output
  • Hands on Experience with the core AWS services
  • Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)

 

  • Expertise in at least one of the following:

    - Kafka, ZeroMQ, AWS SNS/SQS, or equivalent streaming technology

    - Distributed cache/in memory data grids like Redis, Hazelcast, Ignite, or Memcached

    - Distributed column store databases like Snowflake, Cassandra, or HBase

    - Spark, Flink, Beam, or equivalent streaming data processing frameworks

  • Proficiency in writing and reviewing Python and other object-oriented language(s) is a plus
  • Experience building automations and CICD pipelines (integration, testing, deployment)
  • Experience with Kubernetes would be a plus.
  • Good understanding of working with distributed teams using Agile: Scrum, Kanban
  • Strong interpersonal skills as well as excellent written and verbal communication skills

  • Attention to detail and quality, and the ability to work well in and across teams

Cargill Business Services

Posted by Paramjit Kaur
Bengaluru (Bangalore)
2 - 6 yrs
Best in industry
Apache Kafka
Kerberos
Zookeeper
Terraform
Linux administration

As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, empowering highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building, configuring, and supporting platforms while sharing, learning and growing together.


  • Develop and recommend improvements to standard and moderately complex application support processes and procedures. 
  • Review, analyze and prioritize incoming incident tickets and user requests. 
  • Perform programming, configuration, testing and deployment of fixes or updates for application version releases. 
  • Implement security processes to protect data integrity and ensure regulatory compliance. 
  • Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs. 


MINIMUM QUALIFICATIONS

  • A minimum of 2-4 years of experience
  • Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning
  • Full-ecosystem Kafka administration (Kafka, ZooKeeper, kafka-rest, Connect)
  • Experience implementing Kerberos security

Preferred:

  • Experience in Linux system administration
  • Authentication plugin experience, such as basic, SSL, and Kerberos
  • Production incident support, including root cause analysis
  • AWS EC2
  • Terraform
Orboai

Posted by Neha T
Mumbai, Delhi
4 - 7 yrs
₹8L - ₹22L / yr
OpenCV
Image Processing
Image segmentation
Deep Learning
Python

Who Are We

 

Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform offering an AI-based visual enhancement stack, so companies can find a product suited to their needs, where deep-learning-powered technology can automatically improve their imagery.

 

ORBO's solutions are helping the BFSI, beauty and personal care, and e-commerce image retouching industries with digital transformation in multiple ways.

 

WHY US

  • Join a top AI company
  • Grow with your best companions
  • Continuous pursuit of excellence, equality, respect
  • Competitive compensation and benefits

You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.

 

To learn more about how we work, please check out

https://www.orbo.ai/.

 

Description:

We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.

 

Responsibilities:

  • Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
  • Lead a team of ML engineers in developing an industrial AI product from scratch
  • Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation, and deployment
  • Tune models to achieve high accuracy and minimum latency
  • Deploy developed computer vision models on edge devices after optimization to meet customer requirements
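The end-to-end pipeline described above can be sketched, at the level of stage wiring only, in plain Python. Every function and data shape here is an invented placeholder (a real pipeline would use PyTorch/TensorFlow components, real data loaders, and a trained model):

```python
# Stage wiring for ingestion -> preparation -> training -> validation.
# All values below are toy placeholders to show how stages compose.

def ingest():
    # Stand-in for reading frames from a camera feed or image store.
    return [{"pixels": [0.1, 0.9]}, {"pixels": [0.8, 0.2]}]

def prepare(samples):
    # Normalization / augmentation would happen here; we just scale.
    return [{"pixels": [p * 2 for p in s["pixels"]]} for s in samples]

def train(samples):
    # Placeholder "model": a threshold learned as the mean first pixel.
    mean = sum(s["pixels"][0] for s in samples) / len(samples)
    return {"threshold": mean}

def validate(model, samples):
    # Report the fraction of samples the placeholder model flags.
    flagged = sum(s["pixels"][0] > model["threshold"] for s in samples)
    return flagged / len(samples)

def run_pipeline():
    raw = ingest()
    prepared = prepare(raw)
    model = train(prepared)
    return validate(model, prepared)

print(run_pipeline())
```

Keeping each stage behind a small function boundary like this is what makes it practical to later swap the training step for a GPU job or the deployment step for an OpenVINO/TensorRT export without touching the rest of the pipeline.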

 

 

Requirements:

  • Bachelor’s degree
  • Understanding of the depth and breadth of computer vision and deep learning algorithms
  • 4+ years of industrial experience in computer vision and/or deep learning
  • Experience taking an AI product from scratch to commercial deployment
  • Experience with image enhancement, object detection, image segmentation, and image classification algorithms
  • Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT
  • Experience deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
  • Experience with machine/deep learning frameworks such as TensorFlow and PyTorch
  • Proficient understanding of code versioning tools, such as Git

Our perfect candidate is someone who:

  • is proactive and an independent problem solver
  • is a constant learner. We are a fast growing start-up. We want you to grow with us!
  • is a team player and good communicator

 

What We Offer:

  • You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
  • You will be in charge of what you build and be an integral part of the product development process
  • Technical and financial growth!
MNC
Agency job
Remote, Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹25L / yr
.NET
Microsoft Windows Azure
Angular (2+)
Microservices
ASP.NET MVC
Looking for immediate joiners (up to 15 days' notice).
Requirements:
  • At least 2 years of hands-on experience developing solutions on the Microsoft Azure platform
  • 5+ years of industry experience designing and developing enterprise-scale services & platforms
  • 5+ years of experience in C# with solid analytical, debugging, and problem-solving skills
  • At least 1.5 years of hands-on experience with React JS/TypeScript/Bootstrap/Angular and related frameworks
  • At least 3 years of hands-on experience with relational databases (SQL Server/Azure SQL)
Evox Systems Pvt Ltd

Posted by Tejas Chaudhary
Remote, Vadodara, Ahmedabad
2 - 5 yrs
₹3L - ₹6L / yr
Shell Scripting
MySQL
JavaScript
PHP
Development tasks:

- Writing bash scripts
- Working with Docker on the command line
- Configuring various web applications (via browser, DB, config files)
- Working with Git

"Please share with us some Bash code that reflects your proficiency with shell scripting. The more code you share, the better. The minimum is 10 lines."
Affairal

Posted by Govind Balakrishna
Bengaluru (Bangalore)
3 - 15 yrs
₹10L - ₹30L / yr
JavaScript
HTML/CSS
Mobile App Development
Java
PHP
Hey everyone, we look forward to talking to you. We are a disruptive startup in the fashion marketplace segment working on core personalization. Featured at TechCrunch and Web Summit, and voted among the top 100 startups from the Asia region by Tech.co and Red Herring. We are hiring! Our team includes people from Flipkart, Intel, Myntra, Intuit, IBM, and others. We look forward to seeing you onboard on this amazing journey. Thanks & Regards, Govind, Founder/CEO @ Affairal
Symantec

Posted by Sanoop Kannoli
Pune
5 - 10 yrs
₹10L - ₹20L / yr
Puppet
Chef
Hyper-V
Nagios
Perforce
We have an exciting opportunity for a DevOps Engineer to join the team. We are looking for someone to bring not only the hands-on implementation skills needed to deliver and run software services but also fresh ideas to a continuous delivery and Agile environment. You will come from either a strong development background or an operations / infrastructure background with proven DevOps experience supporting Testing and Development teams during planning, development, go-live and release. This person will take a shared responsibility in designing and implementing infrastructure for delivering and running software services.
Why apply to jobs via Cutshort

Personalized job matches
Stop wasting time. Get matched with jobs that meet your skills, aspirations and preferences.

Verified hiring teams
See actual hiring teams, find common social connections or connect with them directly.

Move faster with AI
We use AI to get you faster responses, recommendations and unmatched user experience.

Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.
Get to hear about interesting companies hiring right now

Follow Cutshort