Power BI Developer
Saviance Technologies

Posted by Shipra Agrawal
3 - 5 yrs
₹7L - ₹9L / yr
Delhi, Gurugram, Noida
Skills
Power BI
Business Intelligence (BI)
DAX
Data modeling
Data Visualization
SSAS
SQL

 

Job Title: Power BI Developer (Onsite)

Location: Park Centra, Sec 30, Gurgaon

CTC: 8 LPA

Time: 1:00 PM - 10:00 PM

  

Must Have Skills: 

  • Power BI Desktop
  • DAX queries
  • Data modeling
  • Row-level security
  • Visualizations
  • Data transformations and filtering
  • SSAS and SQL

 

Job description:

 

We are looking for a Power BI (PBI) Analytics Lead responsible for efficient data visualization, DAX queries, and data modeling. The candidate will create complex Power BI reports, write complex M and DAX queries, and work on data modeling, row-level security, visualizations, data transformations, and filtering. They will work closely with the client team to provide solutions and suggestions on Power BI.
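
In practice, this modeling and row-level security work is done in Power Query (M) and DAX inside Power BI Desktop. As a rough, language-agnostic illustration of the underlying ideas only, the Python/pandas sketch below shapes a flat extract into a small star schema and applies a per-user row filter; all table names, columns, and the user-to-region mapping are hypothetical.

```python
# Conceptual sketch only: in Power BI this shaping is done in Power Query (M),
# and row-level security is defined as a DAX filter on a security role.
# All table/column names and the user-region mapping are hypothetical.
import pandas as pd

# Flat sales extract (stand-in for a source query).
sales = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-02-01", "2024-02-03"]),
    "region": ["North", "South", "North", "West"],
    "amount": [120.0, 80.0, 200.0, 50.0],
})

# Dimension table (one row per region) and a fact table referencing it.
dim_region = sales[["region"]].drop_duplicates().reset_index(drop=True)
dim_region["region_key"] = dim_region.index + 1
fact_sales = sales.merge(dim_region, on="region")[["order_id", "order_date", "region_key", "amount"]]

# Row-level security idea: each user only sees rows for their own region.
user_region = {"alice@example.com": "North", "bob@example.com": "West"}  # hypothetical mapping

def rows_visible_to(user: str) -> pd.DataFrame:
    """Return only the fact rows the given user is allowed to see."""
    allowed_key = dim_region.loc[dim_region["region"] == user_region[user], "region_key"].iloc[0]
    return fact_sales[fact_sales["region_key"] == allowed_key]

print(rows_visible_to("alice@example.com"))
```

In Power BI itself, the equivalent row filter would be expressed as a DAX filter on a role rather than in application code.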

 

Roles and Responsibilities:

 

  • Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
  • Understanding Data: You would be performing and documenting data analysis, data validation, and data mapping/design. You would be mining large datasets to determine their characteristics and select appropriate visualizations.
  • Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
  • Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
  • Manage upstream and downstream impact of all changes on automated reporting/dashboards.
  • Independently apply problem-solving ability to identify meaningful insights for the business.
  • Identify automation opportunities and work with a wide range of stakeholders to implement the same.
  • The ability and self-confidence to work independently and increase the scope of the service line

 

Requirements: 

  • 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
  • Sound understanding and knowledge of Power BI visualization and data modeling with DAX queries.
  • Experience in leading and mentoring a small team.

 

 

 


About Saviance Technologies

Founded: 1999
Type: Products & Services
Size: 100-1000
Stage: Raised funding

About

Saviance Technologies is a leader in Digital Health and Healthcare IT Solutions, helping clients across the US, Europe, and India with their Digital Transformation journey. Saviance has a firm footprint in addressing the unique challenges of the US Pharmaceutical & Healthcare industry, especially in the area of Patient Engagement. By providing cost-effective healthcare IT solutions built on a secure model for integrating disparate systems, we are changing the nature and speed of interaction between consumers, physicians, insurance companies, healthcare organizations, and other stakeholders. The Saviance Leadership Team has a combined industry experience of over 150 years at IBM, Fujitsu, Wipro, and Sapient, and includes alumni from Harvard and Wharton. The company is a Tier 1 Diverse Supplier certified at the national level by the National Minority Supplier Development Council.

Connect with the team: Shipra Agrawal

Similar jobs

Global digital transformation solutions provider.
Agency job via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹28L / yr
Databricks
Python
SQL
PySpark
Amazon Web Services (AWS)
+9 more

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
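
As a rough illustration of the pipeline work described above, the PySpark sketch below ingests a raw extract, applies a simple wrangling and join step, and writes partitioned Parquet; the paths, schema, and join key are hypothetical.

```python
# Minimal PySpark pipeline sketch: ingest, wrangle, join, and write.
# Paths, schemas, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Ingest: read a raw CSV extract and a curated reference dataset.
orders = spark.read.option("header", True).csv("s3a://example-bucket/raw/orders/")
customers = spark.read.parquet("s3a://example-bucket/curated/customers/")

# Wrangle: cast types, drop bad rows, derive a partition column.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date"))
    .dropna(subset=["order_id", "customer_id", "amount"])
    .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
)

# Join with the customer dimension and write partitioned Parquet.
enriched = clean.join(customers, on="customer_id", how="left")
(enriched.write
    .mode("overwrite")
    .partitionBy("order_month")
    .parquet("s3a://example-bucket/curated/orders_enriched/"))
```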


Skill Examples:

  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications (a small example follows this list).
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.
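
A minimal sketch of the data-quality checks mentioned in item 4, assuming a PySpark DataFrame and hypothetical thresholds and key columns:

```python
# Minimal data-quality check sketch for a pipeline output (hypothetical path and thresholds).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks_sketch").getOrCreate()
df = spark.read.parquet("s3a://example-bucket/curated/orders_enriched/")  # placeholder path

# Expectation 1: the load produced at least some minimum number of rows.
row_count = df.count()
assert row_count >= 1000, f"Row count too low: {row_count}"

# Expectation 2: key columns contain no nulls.
null_keys = df.filter(F.col("order_id").isNull() | F.col("customer_id").isNull()).count()
assert null_keys == 0, f"Found {null_keys} rows with null keys"

# Expectation 3: no duplicate primary keys.
dup_count = row_count - df.select("order_id").distinct().count()
assert dup_count == 0, f"Found {dup_count} duplicate order_id values"

print("All data-quality checks passed.")
```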

 

Knowledge Examples:

  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, and Azure ADF and ADLF.
  2. Proficient in SQL for analytics and windowing functions (see the example after this list).
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.
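
As a small illustration of item 2, the Spark SQL snippet below uses windowing functions to rank orders within each customer; the table and columns are hypothetical.

```python
# Windowing-function sketch in Spark SQL (table and column names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window_sketch").getOrCreate()
spark.read.parquet("s3a://example-bucket/curated/orders_enriched/").createOrReplaceTempView("orders")

top_orders = spark.sql("""
    SELECT customer_id,
           order_id,
           amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rank_in_customer,
           SUM(amount) OVER (PARTITION BY customer_id) AS customer_total
    FROM orders
""").filter("rank_in_customer <= 3")

top_orders.show()
```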


 

Additional Comments:

# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026

Project Overview:

Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.

The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.

  • Design, build, and maintain scalable data pipelines using Databricks and PySpark.
  • Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.); see the sketch after this list.
  • Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
  • Ensure data quality, performance, and reliability across data workflows.
  • Participate in code reviews, data architecture discussions, and performance optimization initiatives.
  • Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
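
One common pattern behind the AWS integration item above is triggering and monitoring an AWS Glue job from Python. The boto3 sketch below shows the idea with a hypothetical job name and region.

```python
# Sketch: trigger an AWS Glue job and poll until it finishes (job name and region are assumptions).
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="orders-enrichment-job")  # hypothetical Glue job
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="orders-enrichment-job", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue job finished with state: {state}")
```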


Key Skills:

  • Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
  • Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
  • Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
  • Experience with data modeling, schema design, and performance optimization.
  • Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred); a minimal DAG sketch follows this list.
  • Excellent problem-solving, communication, and collaboration skills.
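
For the orchestration item above, a minimal Airflow DAG sketch is shown below; it assumes Airflow 2.x, and the DAG id and task bodies are placeholders.

```python
# Minimal Airflow 2.x DAG sketch (assumed environment; task bodies are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extract step placeholder")


def transform():
    print("transform step placeholder")


with DAG(
    dag_id="orders_daily_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```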

 

Skills: Databricks, PySpark & Python, SQL, AWS services

 

Must-Haves

Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)



******

Notice period - Immediate to 15 days

Location: Bangalore

Classplus
Posted by Peoples Office
Noida
8 - 10 yrs
₹35L - ₹55L / yr
Docker
Kubernetes
DevOps
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
+16 more

About us

 

Classplus is India's largest B2B ed-tech start-up, enabling 1 Lac+ educators and content creators to create their digital identity with their own branded apps. Starting in 2018, we have grown more than 10x in the last year, into India's fastest-growing video learning platform.

 

Over the years, marquee investors like Tiger Global, Surge, GSV Ventures, Blume, Falcon Capital, RTP Global, and Chimera Ventures have supported our vision. Thanks to our awesome and dedicated team, we achieved a major milestone in March this year when we secured our Series D funding.

 

Now as we go global, we are super excited to have new folks on board who can take the rocketship higher🚀. Do you think you have what it takes to help us achieve this? Find Out Below!

 

 

What will you do?

 

· Define the overall process, which includes building a team for DevOps activities and ensuring that infrastructure changes are reviewed from an architecture and security perspective

 

· Create standardized tooling and templates for development teams to create CI/CD pipelines

 

· Ensure infrastructure is created and maintained using Terraform

 

· Work with various stakeholders to design and implement infrastructure changes to support new feature sets in various product lines.

 

· Maintain transparency and clear visibility of costs associated with various product verticals and environments, and work with stakeholders to plan for optimization and implementation

 

· Spearhead continuous experimentation and innovation initiatives to optimize the infrastructure in terms of uptime, availability, latency, and costs

 

 

You should apply if you

 

 

1. Are a seasoned Veteran: Have managed infrastructure at scale running web apps, microservices, and data pipelines using tools and languages like JavaScript (Node.js), Go, Python, Java, Erlang, Elixir, C++ or Ruby (experience in any one of them is enough)

 

2. Are a Mr. Perfectionist: You have a strong bias for automation and taking the time to think about the right way to solve a problem versus quick fixes or band-aids.

 

3. Bring your A-Game: Have hands-on experience and ability to design/implement infrastructure with GCP services like Compute, Database, Storage, Load Balancers, API Gateway, Service Mesh, Firewalls, Message Brokers, Monitoring, Logging and experience in setting up backups, patching and DR planning

 

4. Are up with the times: Have expertise in one or more cloud platforms (Amazon Web Services, Google Cloud Platform, or Microsoft Azure), and have experience in creating and managing infrastructure completely through a tool like Terraform

 

5. Have it all on your fingertips: Have experience building CI/CD pipelines using Jenkins and Docker for applications running primarily on Kubernetes, plus hands-on experience in managing and troubleshooting applications running on K8s (a small example follows this list)

 

6. Have nailed the data storage game: Good knowledge of relational and NoSQL databases (MySQL, MongoDB, BigQuery, Cassandra…)

 

7. Bring that extra zing: Have the ability to program/script and strong fundamentals in Linux and networking.

 

8. Know your toys: Have a good understanding of microservices architecture and Big Data technologies; experience with highly available distributed systems, scaling data store technologies, and creating multi-tenant and self-hosted environments is a plus
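
For the Kubernetes troubleshooting experience in point 5, here is a small sketch using the official Kubernetes Python client to flag pods that are not Running; the namespace and kubeconfig setup are assumptions.

```python
# Sketch: list pods that are not Running in a namespace (assumes a local kubeconfig).
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a cluster
v1 = client.CoreV1Api()

NAMESPACE = "production"  # hypothetical namespace

for pod in v1.list_namespaced_pod(namespace=NAMESPACE).items:
    phase = pod.status.phase
    if phase != "Running":
        restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
        print(f"{pod.metadata.name}: phase={phase}, restarts={restarts}")
```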

 

 

Being Part of the Clan

 

At Classplus, you’re not an “employee” but a part of our “Clan”. So, you can forget about being bound by the clock as long as you’re crushing it workwise😎. Add to that some passionate people working with and around you, and what you get is the perfect work vibe you’ve been looking for!

 

It doesn’t matter how long your journey has been or your position in the hierarchy (we don’t do Sirs and Ma’ams); you’ll be heard, appreciated, and rewarded. One can say, we have a special place in our hearts for the Doers! ✊🏼❤️

 

Are you a go-getter with the chops to nail what you do? Then this is the place for you.

Accolite Digital
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL Server Integration Services (SSIS)
+10 more

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources (a small sketch follows this list).
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.
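
A compact sketch of the pipeline pattern in the first responsibility above, using pandas and SQLAlchemy; the connection strings, table names, and columns are hypothetical.

```python
# Small batch ETL sketch: extract from a source database, transform, load to a warehouse.
# Connection strings, table names, and columns are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://user:pass@source-host:5432/appdb")
warehouse = create_engine("postgresql+psycopg2://user:pass@dwh-host:5432/analytics")

# Extract: pull the last day of orders from the operational database.
orders = pd.read_sql(
    "SELECT order_id, customer_id, order_ts, amount FROM orders "
    "WHERE order_ts >= CURRENT_DATE - INTERVAL '1 day'",
    source,
)

# Transform: basic cleaning and a derived column.
orders = orders.dropna(subset=["order_id", "customer_id"])
orders["order_date"] = pd.to_datetime(orders["order_ts"]).dt.date

# Load: append into a warehouse staging table.
orders.to_sql("stg_orders", warehouse, if_exists="append", index=False)
print(f"Loaded {len(orders)} rows into stg_orders")
```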

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, or Informatica.

Experience with SQL and relational databases such as SQL Server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/data lake technologies like Snowflake or Databricks.

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork / growth contribution

  • Helping the team conduct interviews and identify the right candidates
  • Adhering to timelines
  • Timely status communication and upfront communication of any risks
  • Teach, train, and share knowledge with peers
  • Good communication skills
  • Proven ability to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

OYO Rooms
Posted by Shraddha Jhamb
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad
4 - 6 yrs
₹5L - ₹20L / yr
Penetration testing
Amazon Web Services (AWS)
Azure
OSCP
LCEH
+1 more

About the Company

OYO Hotels & Homes is the world's third-largest and fastest-growing chain of leased and franchised hotels, homes & spaces, managing over 1 million exclusive rooms across 800 cities and 80 countries. OYO was founded on the mission that everyone deserves a quality living and working space, and we are very passionate about this mission. Technology and innovation play a critical role in this mission, and therefore today we employ world-class engineers, product managers, and designers across core markets & geographies. If you are looking for a fast-paced environment and are itching to create a large impact through technology that touches hundreds of millions of customers across the globe, we would love to hear from you.

 

Key Responsibilities:

 

  • Conducting application (web & mobile) and infrastructure penetration testing assessments.
  • Deploy, improve, and utilize SAST/DAST/SCA and other cybersecurity solutions to detect and prevent security vulnerabilities.
  • Work closely with the business, product, and development/engineering teams to provide input and guidance on developing secure products, and help teams adopt shift-security-to-left practices.
  • Work closely with the DevOps team to secure the cloud environment.
  • Develop and maintain cybersecurity process activities including security requirements engineering, threat modelling, code reviews, and cyber risk assessment.
  • Improve and automate cybersecurity processes within the CI/CD pipelines (a small example follows this list).
  • Continuously review and identify security improvement opportunities in existing products, processes, services, and workflows to ensure the people, products, and technology in the organization are protected against current and future cybersecurity threats.
  • Deliver awareness sessions on secure development to engineering/development teams.
  • Drive continuous improvement activities to define, measure, visualize, and improve key cybersecurity metrics related to application security.
  • Prepare and launch social engineering campaigns.
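
As one small example of the CI/CD security automation bullet above, the script below checks a deployment for a few common security response headers and fails the pipeline step if any are missing; the target URL and header list are hypothetical, and this complements rather than replaces SAST/DAST tooling.

```python
# Sketch: simple security-header check that could run as a CI step.
# The target URL and the header list are hypothetical examples.
import sys
import requests

TARGET = "https://staging.example.com/"
EXPECTED_HEADERS = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
]

response = requests.get(TARGET, timeout=10)
missing = [h for h in EXPECTED_HEADERS if h not in response.headers]

if missing:
    print(f"Missing security headers on {TARGET}: {', '.join(missing)}")
    sys.exit(1)  # fail the pipeline step
print("All expected security headers present.")
```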

 

Key Skills:

 

  • Expertise in application(Web & Mobile) and infrastructure penetration testing.
  • Strong experience with Azure or AWS cloud environments and their security controls.
  • Experience with microservices architectures & distributed Platforms
  • Strong experience with using Agile software development and securing CI/CD pipeline.
  • Coding experience in scripting and programming languages (such as Terraform, Java, Python, Ruby, etc.).
  • Knowledge of how modern web & mobile apps are designed, developed, and deployed across different platforms.
  • Knowledge of common exploitation techniques and mitigations.
  • Experience in implementing and managing a vulnerability management program (process and technology).
  • Experience and knowledge of implementing a DevSecOps ecosystem and strong understanding of Dynamic and Static Application Security Testing (DAST & SAST).
  • Understanding of the main cybersecurity tools (SIEM, IPS, XDR, etc.).
  • Strong understanding of OWASP, PTES and other penetration testing methodologies.
  • Understanding of global security frameworks and standards like NIST, ISO 27001, GDPR, PCI etc.
  • Strong knowledge in preparing and launching social engineering campaigns.
  • Ability to program or script in your preferred language
  • Good understanding of network and OS principles
  • Strong written and spoken English skills and ability to write high-quality reports
  • An information security qualification, e.g. CSSLP, CEH, OSCP, or a similar certification

 

Cultural Traits common to all OYO Leaders -

 

● Dealing with Ambiguity and Adaptability – we are a large but fast-growing company today, without enough existing processes or rules of engagement, and the environment changes rapidly due to new businesses, geographies, strategic partnerships, etc. You need to be able to create organization out of chaos, operate in an environment with minimal structure, and adapt to change quickly while maintaining high velocity

● Ownership – anything between you and your job is also your job

● Bias for Action – speed matters a lot, so does quality. Ideal leader will be pragmatic, action-oriented and know the right balance between competing priorities

● Hunger to change the world – you need to be ambitious and willing to do more. If you believe you have already achieved your best and primarily looking to impart that vast knowledge, we aren’t the right place for you

 

Job Locations: We have a pan-India presence with tech centers based out of Gurugram, Bangalore & Hyderabad. However, we are currently working from home.

 

Borderless Softech Pvt Ltd
Posted by Madri Prasad
Bengaluru (Bangalore)
7 - 12 yrs
₹20L - ₹40L / yr
Node.js
MongoDB
React.js
TypeScript
Object Oriented Programming (OOPs)

Role description:

We are looking for a Lead Full-stack Developer with expertise and experience in designing and developing applications, including new development, enhancements, maintenance, and support. The role involves continuous collaboration with the team and partners.

Candidate description:

Should have:

  • Passion for technology and financial domain with demonstrated ability to learn quickly
  • Delivery focus with ability to take full ownership
  • Strong commitment to quality and delivery
  • Strong communication skills

Skills/Knowledge and experience:

  • Object-oriented design and development skills
  • Web services: Node.js/Express.js, REST APIs
  • Frontend: React, Angular (JavaScript and TypeScript)
  • Unix and shell scripting
  • SQL proficiency; MongoDB is good to know
Fintech MNC
Remote only
3 - 5 yrs
₹10L - ₹12L / yr
SDET
Test Automation (QA)
WHAT YOU'LL BE DOING:
● Contribute to quality processes and improvements within the Software Platform Team
● Collaborate with developers and QA engineers on geographically-diverse teams to ensure the quality implementation of application features, E2E
● Ensure proper test coverage and testability within development sprints
● Participate in QA sprint planning within the Software Platform Team
● Contribute to the development and maintenance of automation tests and frameworks
● Escalate defect reports and critical issues
● Work closely with the technical project managers to prioritize defect resolution

WE'RE LOOKING FOR SOMEONE WITH:
● BS in Computer Science or equivalent
● Minimum of 3 years of experience as a QA Engineer, preferably at an enterprise SaaS organization, with working experience as a software development engineer in test (SDET)
● Minimum of 2 years of experience testing the Data Layer specifically
● Strong knowledge of a full tech stack and strong debugging skills
● Experience in REST API testing, both automated and manual (a minimal automated example follows this list)
● Experience with testing SOA and microservices applications
● Experience in writing scripts or programs to analyze/validate large data sets
● Proficient in QA methodology and process (Agile testing experience preferred)
● Proficient in SQL: queries, inserts, and updates
● Experience with testing ETLs
● Strong communication skills: must be able to communicate with other functional teams to coordinate systems integration testing requirements
● Ability to convert complex business and technical requirements, use cases, and user stories to test cases and test plans
● Hands-on experience with defect management tools such as Jira
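
A minimal sketch of the automated REST API testing mentioned above, using pytest with requests; the base URL, endpoint, and response fields are hypothetical.

```python
# Minimal automated REST API test sketch (pytest + requests).
# The base URL, endpoint, and response shape are hypothetical.
import requests

BASE_URL = "https://api.example.com"


def test_get_account_returns_expected_fields():
    response = requests.get(f"{BASE_URL}/v1/accounts/123", timeout=10)

    assert response.status_code == 200
    body = response.json()
    assert body["id"] == "123"
    assert "balance" in body


def test_unknown_account_returns_404():
    response = requests.get(f"{BASE_URL}/v1/accounts/does-not-exist", timeout=10)
    assert response.status_code == 404
```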

PLUSES:
● Personal or professional experience with Robo-Advising/FinTech Products is highly desired
● Specific QA experience in a startup environment culture
● Experience with queuing/messaging systems (RabbitMQ, Kafka, etc.)
● Experience with time series or document databases
● Experience in performance engineering/testing of scalable systems
Huge openings for Quality Production
Agency job via AA MANPOWER SOLUTION by Janani Janu
Remote, Vadapalani, Chennai
0 - 2 yrs
₹1L - ₹2L / yr
Mechanical engineering
Production
Quality control
Plant maintenance

Greetings from “AA Manpower Solutions” {Ref: JANANI} 



We are hiring Quality Maintenance Engineers for an MNC company.

Job Summary:
Currently, we are hiring for the position of Production and Quality Maintenance Engineer based at our Chennai location. Please refer to the job description below and walk in for this top MNC.

Job Duties and Responsibilities:

  • Observing, learning, and understanding a wide range of engineering skills and processes while under the instruction of skilled technical staff.
  • Develop new systems, processes, and equipment to ensure manufacturing is efficient and effective.
  • Investigate and analyse problems and devise solutions that save money, time, and materials.
  • Manage budgets and ensure project deadlines are adequately met according to standards and requirements.

Desired Profile: 

Designation: GET (Graduate Engineering Trainee) and NEEM; on-roll job as Production Engineer / Quality Engineer / Design Engineer / Quality Maintenance

Qualification: Diploma or B.E/B.Tech in Mechanical Engineering, EEE, ECE, or Auto Engineering

Experience: 0 - 3 years

Salary: INR 1,40,000 – INR 2,80,000 P.A.

Nature of Job: Production, Quality, or Maintenance

Job Type: Full-time 

Benefits / Facility:

  • Bus + canteen available
  • ESI + PF available
  • 8 hours of duty
  • Immediate joining

Work Location: Chennai 

Interview Date: starting 1st February

Last date to apply: 29th February

Feel free to call us for any clarification [Call between 10 am to 5 pm] 

Interested candidates visit our office

Interview Venue:

No. 24, F1, First Floor, 

BajanaiKoil 2nd street, 

Vadapalani, 

Chennai-600026 

Landmark: SIMS Hospital Backside 

(Above south Indian Movie Still camera Man Association) 

Regards,
Janani 

Radical HealthTech
Posted by Shibjash Dutt
NCR (Delhi | Gurgaon | Noida)
2 - 7 yrs
₹5L - ₹15L / yr
Python
Terraform
Amazon Web Services (AWS)
Linux/Unix
Docker
DevOps Engineer


Radical is a platform connecting data, medicine and people -- through machine learning, and usable, performant products. Software has never been the strong suit of the medical industry -- and we are changing that. We believe that the same sophistication and performance that powers our daily needs through millions of consumer applications -- be it your grocery, your food delivery or your movie tickets -- when applied to healthcare, has a massive potential to transform the industry, and positively impact lives of patients and doctors. Radical works with some of the largest hospitals and public health programmes in India, and has a growing footprint both inside the country and abroad.


As a DevOps Engineer at Radical, you will:

Work closely with all stakeholders in the healthcare ecosystem - patients, doctors, paramedics and administrators - to conceptualise and bring to life the ideal set of products that add value to their time
Work alongside Software Developers and ML Engineers to solve problems and assist in architecture design
Work on systems which have an extraordinary emphasis on capturing data that can help build better workflows, algorithms and tools
Work on high performance systems that deal with several million transactions, multi-modal data and large datasets, with a close attention to detail


We’re looking for someone who has:

Familiarity and experience with writing working, well-documented, and well-tested scripts, Dockerfiles, and Puppet/Ansible/Chef/Terraform configurations.
Proficiency with scripting languages like Python and Bash.
Knowledge of systems deployment and maintenance, including setting up CI/CD, working alongside Software Developers, and monitoring logs, dashboards, etc.
Experience integrating with a wide variety of external tools and services
Experience navigating AWS and leveraging appropriate services and technologies rather than DIY solutions (such as hosting an application directly on EC2 vs. containerisation or Elastic Beanstalk); a small scripting example follows below
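
As a small example of the scripting side of this role, the boto3 sketch below reports Elastic Beanstalk environment status and health; the region is an assumption and credentials come from the standard AWS configuration.

```python
# Sketch: report Elastic Beanstalk environment status and health with boto3.
# The region is an assumption; credentials come from the default AWS config/environment.
import boto3

eb = boto3.client("elasticbeanstalk", region_name="ap-south-1")

for env in eb.describe_environments()["Environments"]:
    print(f"{env['EnvironmentName']}: status={env['Status']}, health={env['Health']}")
```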


It’s not essential, but great if you have:

An established track record of deploying and maintaining systems.
Experience with microservices and decomposition of monolithic architectures
Proficiency in automated testing.
Proficiency with the Linux ecosystem
Experience in deploying systems to production on cloud platforms such as AWS


The position is open now, and we are onboarding immediately.


Please write to us with an updated resume, and one thing you would like us to see as part of your application. This one thing can be anything that you think makes you stand apart among candidates.


Radical is based out of Delhi NCR, India, and we look forward to working with you!


We're looking for people who may not know all the answers, but are obsessive about finding them, and take pride in the code that they write. We are more interested in the ability to learn fast and think rigorously, and in people who aren't afraid to challenge assumptions and take large bets -- only to work hard and prove themselves correct. You're encouraged to apply even if your experience doesn't precisely match the job description. Join us.

Unacademy
Posted by Garima Singh
Bengaluru (Bangalore)
2 - 5 yrs
₹4L - ₹7L / yr
Acquisition
Recruitment/Talent Acquisition
Recruitment
Inbound Recruitment
Outbound Recruitment
We are looking for a Talent Acquisition Consultant to develop and implement sourcing and employer branding techniques that will help us recruit talented employees. Talent Acquisition Consultant responsibilities include forecasting hiring needs, sourcing potential hires on various online channels, building relationships with passive candidates and hiring managers, and evaluating candidates' performance in interviews and assignments.

Responsibilities:
  • Determine current staffing needs
  • Source candidates on social networks and niche platforms
  • Review job applications to identify high-potential candidates
  • Liaise with hiring managers to understand each position's expectations
  • Track key recruiting KPIs
  • Foster long-term relationships with past applicants and potential candidates

Requirements:
  • Minimum 2 years of work experience as a Talent Acquisition Consultant or in a similar role
  • Hands-on experience with candidate sourcing and evaluation
  • Familiarity with job boards, resume databases, and Applicant Tracking Systems
  • Excellent communication and interpersonal abilities
  • Strong decision-making skills
  • Highly organized and self-motivated
  • Demonstrated ability to prioritize multiple projects simultaneously
MindTickle
Posted by Rohit Chib
Pune
1 - 3 yrs
₹10L - ₹25L / yr
Java
MySQL
Node.js
Job Description

We are looking for a rockstar technology evangelist for the engineering team who will be building, maintaining, and scaling the platform at MindTickle, selecting the most appropriate architecture so that it suits the business needs and achieves the desired results under given constraints.

Strategic Responsibility:
  • Design & Build - Designing and developing high-volume, low-latency applications for mission-critical systems and delivering high availability and performance
  • Collaborate - Collaborating within your product streams and team to bring best practices and leverage a world-class tech stack
  • Measurable outcome - You will need to set quantifiable objectives that encapsulate the quality attributes of a system. The fitness of the application is measured against set marks.
  • DevOps - You will need to set up all the essentials (tracking/alerting) to make sure the infrastructure/software you built is working as expected

Personality:
  • Requires excellent communication skills – written, verbal, and presentation.
  • You should be a team player.
  • You should be positive towards problem solving and have a very structured thought process to solve problems.
  • You should be agile enough to learn new technology if needed.

Qualifications:
  • BTech / BS / BE / MTech / MS / ME in CS or equivalent from IITs or top-tier engineering colleges
  • 1-3 years of strong software (application or infrastructure) development experience and software engineering skills (Java/Scala, Node, and JavaScript preferred)
  • Deep expertise and practical knowledge of operating systems, MySQL, and NoSQL databases (Redis, Couchbase, MongoDB, ES, or any graph DB)
  • Working knowledge of Amazon Web Services (AWS)
  • Experience with Docker (docker.io) will be a plus
  • Self-motivated and a team player