
11+ ArcSight Jobs in Pune | ArcSight Job openings in Pune

Apply to 11+ ArcSight Jobs in Pune on CutShort.io. Explore the latest ArcSight Job opportunities across top companies like Google, Amazon & Adobe.

confidential

Agency job
via OutworX Corporation by Priyanka Arora
Pune
3 - 6 yrs
₹4L - ₹7L / yr
Information security
IBM QRadar
Firewall
ArcSight
QRadar
+1 more
  • Information Security experience as a Security Analyst (SOC)
  • Good understanding of security solutions like Anti-virus, DLP, Proxy, Firewall filtering/monitoring, IPS, Email Security, EPO, WAF, etc.
  • Hands-on experience with IBM QRadar and ArcSight SIEM tools for log monitoring and analysis, and with the ServiceNow ticketing tool (a simple scripted illustration follows this list)
  • Good knowledge of networking concepts including OSI layers, subnets, TCP/IP, ports, DNS, DHCP, firewall monitoring, content filtering, Check Point, etc.
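To give a concrete, purely illustrative sense of the log-analysis work described above, the sketch below flags repeated failed logins in a plain-text auth log. The log path, format, and threshold are assumptions for the example; in practice this kind of correlation rule lives inside the SIEM (QRadar/ArcSight).

```python
# Illustrative sketch only: count failed SSH logins per source IP and alert
# above a threshold. Log path, format, and threshold are assumptions.
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"Failed password for .* from (\d{1,3}(?:\.\d{1,3}){3})")
THRESHOLD = 5  # alert after this many failures from a single source

failures = Counter()
with open("/var/log/auth.log") as log:  # hypothetical log location
    for line in log:
        match = FAILED_LOGIN.search(line)
        if match:
            failures[match.group(1)] += 1

for ip, count in failures.most_common():
    if count >= THRESHOLD:
        print(f"ALERT: {count} failed logins from {ip}")
```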

 

Tata Consultancy Services
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Delhi
5 - 10 yrs
₹7L - ₹14L / yr
React.js
NodeJS (Node.js)
Redux/Flux
TypeScript

We are actively hiring for React and Node.js roles at a top MNC.

Skills: React.js, Node.js, Redux, TypeScript


Wissen Technology

Posted by Vijayalakshmi Selvaraj
Mumbai, Pune
4 - 8 yrs
Best in industry
Market Research
SQL
Equity derivatives
SoapUI
Postman
+4 more
  • 4-8 years of experience in functional testing with a strong technical foundation
  • Experience in the Capital Markets domain is a MUST
  • Exposure to API testing tools like SoapUI and Postman (a scripted equivalent is sketched after this list)
  • Well versed with SQL
  • Hands on experience in debugging issues using Unix commands
  • Basic understanding of XML and JSON structures
  • Knowledge of FitNesse is good to have
  • Should be able to join early.
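As a hedged illustration of the API checks mentioned above, here is a minimal scripted equivalent of what one might assert in SoapUI or Postman. The endpoint URL and the expected JSON fields are invented for the example.

```python
# Illustrative sketch only: a basic API response check, analogous to a Postman
# test. The endpoint and expected fields below are hypothetical.
import requests

response = requests.get("https://api.example.com/v1/trades/12345", timeout=10)
assert response.status_code == 200, f"unexpected status {response.status_code}"

trade = response.json()
for field in ("tradeId", "instrument", "quantity", "price"):
    assert field in trade, f"missing field: {field}"

print("basic structural checks passed")
```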


Client

Posted by Kunwar Arora
Bengaluru (Bangalore), Mumbai, Pune, Gandhinagar
7 - 12 yrs
₹1L - ₹21L / yr
SAP PLM
Document Management System
CAD Integration
Object Links
Status Management

We are looking for a SAP PLM DMS Consultant with in-depth knowledge of Document Management System configuration and integration. Candidate must be proficient in DMS object links, document info records, status management, and CAD interface. Experience in managing engineering change processes is preferred.

Gruve
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Pune
5+ yrs
Up to ₹50L / yr (varies)
Python
SQL
Data engineering
Apache Spark
PySpark
+6 more

About the Company:

Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

 

Why Gruve:

At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

 

Position summary:

We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. 

Key Roles & Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake (a minimal illustration follows this list).
  • Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
  • Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
  • Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
  • Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
  • Implement data governance, security, and compliance best practices.
  • Build and maintain data models, transformations, and data marts for analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
  • Automate infrastructure and deployments using Terraform, Airflow, or dbt.
  • Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
  • Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
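The following is a minimal, illustrative PySpark batch job of the kind described in the responsibilities above. The input path, columns, and output location are assumptions; a real pipeline would typically target Delta Lake or Snowflake rather than plain Parquet.

```python
# Illustrative sketch only: a small batch ETL step. Paths and column names
# (event_ts, user_id, event_type) are assumptions for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Extract: read raw semi-structured events
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: basic cleansing plus a daily aggregate
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("user_id").isNotNull())
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write a partitioned table (a Delta or Snowflake target would differ)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```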


Basic Qualifications:

  • Bachelor’s or Master’s Degree in Computer Science or Data Science.
  • 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
  • Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
  • Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
  • Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
  • Proficiency in SQL, Python, or Scala for data transformation and analytics.
  • Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
  • Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
  • Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster (a minimal DAG sketch follows this list).
  • Strong understanding of data governance, access control, and encryption strategies.
  • Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
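As a rough sketch of the orchestration experience asked for above, a minimal Airflow DAG might look like the following. The DAG id, schedule, and task body are assumptions (Prefect or Dagster equivalents would differ), and it assumes a recent Airflow 2.x.

```python
# Illustrative sketch only: a one-task daily DAG. The dag_id, schedule, and
# callable are hypothetical; a real pipeline would trigger Spark/dbt jobs.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_daily_etl():
    print("running daily ETL")  # placeholder for the real pipeline trigger

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="run_daily_etl", python_callable=run_daily_etl)
```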


Preferred Qualifications:

  • Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
  • Experience in BI and analytics tools (Tableau, Power BI, Looker).
  • Familiarity with data observability tools (Monte Carlo, Great Expectations).
  • Experience with machine learning feature engineering pipelines in Databricks.
  • Contributions to open-source data engineering projects.
Data Axle

Posted by Eman Khan
Pune
8 - 13 yrs
Best in industry
Data architecture
Systems design
Spark
Apache Kafka
Flink
+5 more

About Data Axle:

 

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.

Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

 

General Summary:

 

As a Digital Data Management Architect, you will design, implement, and optimize advanced data management systems that support processing billions of digital transactions, ensuring high availability and accuracy. You will leverage your expertise in developing identity graphs, real-time data processing, and API integration to drive insights and enhance user experiences across digital platforms. Your role is crucial in building scalable and secure data architectures that support real-time analytics, identity resolution, and seamless data flows across multiple systems and applications.

 

Roles and Responsibilities:

 

  1. Data Architecture & System Design:
  • Design and implement scalable data architectures capable of processing billions of digital transactions in real-time, ensuring low latency and high availability.
  • Architect data models, workflows, and storage solutions to enable seamless real-time data processing, including stream processing and event-driven architectures.
  2. Identity Graph Development:
  • Lead the development and maintenance of a comprehensive identity graph to unify disparate data sources, enabling accurate identity resolution across channels (a toy illustration of identity linking follows this list).
  • Develop algorithms and data matching techniques to enhance identity linking, while maintaining data accuracy and privacy.
  3. Real-Time Data Processing & Analytics:
  • Implement real-time data ingestion, processing, and analytics pipelines to support immediate data availability and actionable insights.
  • Work closely with engineering teams to integrate and optimize real-time data processing frameworks such as Apache Kafka, Apache Flink, or Spark Streaming.
  4. API Development & Integration:
  • Design and develop real-time APIs that facilitate data access and integration across internal and external platforms, focusing on security, scalability, and performance.
  • Collaborate with product and engineering teams to define API specifications, data contracts, and SLAs to meet business and user requirements.
  5. Data Governance & Security:
  • Establish data governance practices to maintain data quality, privacy, and compliance with regulatory standards across all digital transactions and identity graph data.
  • Ensure security protocols and access controls are embedded in all data workflows and API integrations to protect sensitive information.
  6. Collaboration & Stakeholder Engagement:
  • Partner with data engineering, analytics, and product teams to align data architecture with business requirements and strategic goals.
  • Provide technical guidance and mentorship to junior architects and data engineers, promoting best practices and continuous learning.
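To make the identity-graph idea above a little more concrete, here is a toy, purely illustrative union-find sketch that links identifiers observed together in the same event. The identifier formats and sample pairs are made up; production identity resolution involves far richer matching and privacy controls.

```python
# Illustrative sketch only: link identifiers seen together into one identity
# cluster using union-find. Sample identifiers below are made up.
from collections import defaultdict

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

# Pairs of identifiers observed in the same transaction or session
observed_links = [
    ("email:ana@example.com", "device:abc123"),
    ("device:abc123", "cookie:xyz"),
    ("email:bob@example.com", "device:def456"),
]

uf = UnionFind()
for a, b in observed_links:
    uf.union(a, b)

# Group identifiers by their resolved root: one cluster per "person"
clusters = defaultdict(set)
for node in list(uf.parent):
    clusters[uf.find(node)].add(node)
print(dict(clusters))
```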

 

 

Qualifications:

 

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • 10+ years of experience in data architecture, digital data management, or a related field, with a proven track record in managing billion+ transactions.
  • Deep experience with identity resolution techniques and building identity graphs.
  • Strong proficiency in real-time data processing technologies (e.g., Kafka, Flink, Spark) and API development (RESTful and/or GraphQL); a minimal consumer sketch follows this list.
  • In-depth knowledge of database systems (SQL, NoSQL), data warehousing solutions, and cloud-based platforms (AWS, Azure, or GCP).
  • Familiarity with data privacy regulations (e.g., GDPR, CCPA) and data governance best practices.
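As a hedged sketch of the real-time processing stack named above, the snippet below consumes a Kafka topic with the confluent-kafka client. The broker address, topic, and consumer group are assumptions, and error handling is intentionally minimal.

```python
# Illustrative sketch only: consume a stream of transaction events. Broker,
# topic, and group id are hypothetical; enrichment logic is elided.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "txn-enrichment",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["digital-transactions"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # In a real pipeline the record would be validated, matched against the
        # identity graph, and forwarded to downstream analytics.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```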

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Hexaware

Agency job
via pitcs by Seganpalk George
Chennai, Mumbai, Pune
5 - 10 yrs
₹25L - ₹30L / yr
Sitecore
ASP.NET MVC
HTML/CSS
JavaScript
Microsoft Visual Studio

5+ years of experience in Content Management Systems in general, with at least 3 years of specific Sitecore experience required. Sitecore 8.x+ experience is ideal.

· Bachelor's in computer science, software engineering or similar

· Hands-on Sitecore development experience.

· Experience with the .NET Framework, including Visual Studio, C#, and ASP.NET MVC, with an understanding of front-end basics.

· Client-side browser technologies including JavaScript, CSS, HTML, Responsive Web Design, jQuery, and other front-end development tools. Experience with Agile methodologies.

· Experience with REST, Web Services, Web application frameworks

· Experience with build automation tools

· Experience in TDD and software testing frameworks

· Knowledge of IDEs and source control management (ideally TFS, Git, SVN, or similar)

· Strong adherence to a consistent delivery methodology and associated deliverable generation

CricStox Private Limited

Posted by Ishwar Sharma
Pune
2 - 3 yrs
₹7L - ₹9L / yr
Vue.js
JavaScript
HTML/CSS
CricStox Frontend Engineer @ CricStox
CricStox is a Pune startup building a trading solution in the realm of gametech x fintech.
We intend to build a sport-agnostic platform to allow trading in stocks of sportspersons under any sport through our mobile & web-based applications.
We’re currently hiring a Frontend Engineer who will gather and refine specifications and requirements based on technical needs and implement them using best software development practices.
Responsibilities?
● Mainly, but not limited to, maintaining, expanding, and scaling our microservices/app/site.
● Integrate data from various back-end services and databases.
● Stay plugged into emerging technologies and industry trends and apply them to operations and activities.
● Comfortably work and thrive in a fast-paced environment, learn rapidly, and master diverse web technologies and techniques.
● Juggle multiple tasks within the constraints of timelines and budgets with business acumen.
What skills do I need?
● Excellent programming skills and in-depth knowledge of modern HTML5 and CSS3 (including preprocessors like SASS).
● Excellent programming skills in JavaScript or TypeScript.
● Basic understanding of Node.js with the Nest framework or equivalent.
● Good programming skills in Vue 3.x with the Composition API.
● Good understanding of CSS frameworks like Quasar, Tailwind, etc.
● A solid understanding of how web applications work, including security, session management, and best development practices.
● Adequate knowledge of database systems, OOP, and web application development.
● Good functional understanding of containerising applications using Docker.
● Basic understanding of how cloud infrastructures like AWS and GCP work.
● Basic understanding of setting up a GitHub CI/CD pipeline to automate building Docker images, pushing them to AWS ECR, and deploying to the cluster.
● Proficient understanding of code versioning tools, such as Git (or equivalent).
● Proficient understanding of Agile methodology.
● Hands-on experience with network diagnostics, monitoring, and network analytics tools.
● Basic knowledge of the Search Engine Optimization process.
● Aggressive problem diagnosis and creative problem-solving skills.
Bajaj Finserv Health

Posted by Divybhan Sisodia
Pune
5 - 10 yrs
₹18L - ₹35L / yr
NodeJS (Node.js)
MongoDB
Mongoose
Express
Kubernetes

Key Objectives/Responsibilities of this Role:

  • Understand the UI needs, backend design and data design and create the backend code for the same
  • Backend development in Node.js 12 & above, ExpressJs, NestJs, RESTful web services, Docker, MongoDB, Azure AKS or Kubernetes
  • Collaboration with other team members (Architect, UI, backend developers, DBA, Data analytics team) to work in coherence
  • Working experience in Agile and Scrum methodologies
  • Manage SDEs and Interns to drive the deliverables

 

Mandatory Skillset & Tools: Node.js 12 and above, ExpressJs, NestJs, writing unit tests, schedulers, MongoDB, working knowledge of Kubernetes (AKS), Git version control, event-driven design, microservices, concurrency, RESTful web services, Docker

 

Primary Skill: Node.js 12 & above, ExpressJs, NestJs, Microsoft development platform, testing & debugging skills, Postman, CI/CD pipeline, Docker, Docker Hub, writing unit tests

 

Secondary Skill: Load & performance testing, testing REST web services, basic shell commands, Linux, good written, verbal, and presentation skills, soft-spoken, proactive

Payas technologies pvt ltd
Posted by Sakshi Verma
Pune
0 - 0 yrs
₹3000 - ₹4000 / mo
Human Resources (HR)
HR analytics
Recruitment
Communication Skills

Internship Duties

As with any internship, there is a mixture of duties; this one will include some admin tasks as well as great HR experience in a busy HR department.

 

Duration:

4-month internship (part time / full time)


ROLES AND RESPONSIBILITIES

    • Organize and maintain personnel records
    • Update internal databases (e.g. record sick or maternity leave)
    • Prepare HR documents, like employment contracts and  new hire guides
    • Revise company policies 
    • Liaise with external partners, like insurance vendors, and ensure legal compliance
    • Create regular reports and presentations on HR metrics (e.g. turnover ratios)
    • Answer employees’ queries about HR-related issues
    • Assist payroll department by providing relevant employee information (e.g. leaves of absence, sick days and work schedules)
  • Participate in HR projects (e.g. help organize a job fair event)
  • Gathering payroll data like working hours, leaves and bank accounts
  • Screening resumes and application forms
  • Assist in end-to-end recruitment
  • Record all resumes and personnel documents in HR electronic files
  • File physical HR records in HR filing cabinet
  • Liaise with recruitment agencies
  • Proofreading and editing consultant profiles
  • Liaise with line managers to record HR issues
  • Scanning and emailing HR documents
  • Create Employee packs
  • Coordinate Induction appointments
  • Create interview appointments
  • Creating job advertisements
  • Phone screen applicants
  • Data Entry and general administration
  • Any extra duties the HR Consultant requires
  • Complete appropriate paperwork for new and existing employees
  • Send off training invoices to finance and liaise with them about getting them approved

 

  • Provide administration support
  • Corporate tie-ups via email, LinkedIn, social media, and calling for placements
  • Understanding the product platform and its value add for related businesses
  • Preparing monthly/quarterly/annual reports
  • Perform other activities assigned by Management. 
  • A flair for establishing an instant rapport with clients.
  • Dynamic, aggressive, result-oriented, and a self-starter.
  • Knowledge of MS office and related tools

 

 

Required Experience, Skills and Qualifications

Qualification: MBA-HR

Stipend: ₹3,000 per month

Skills: interpersonal skills, communication skills, convincing skills.

Zymr

Posted by Amit Patel
Bengaluru (Bangalore), Pune
5 - 10 yrs
₹10L - ₹15L / yr
Ruby on Rails (ROR)
Ruby
JavaScript
React.js

You’ll be joining the engineering team responsible for delivering SaaS and on-premise solutions that orchestrate case data workflows and provide data-driven legal insights for our clients. You will lead the development, maintenance, and scaling of our strategic initiatives, using a variety of distributed systems and cloud-technology stacks to drive our next level of growth. As an early member of our small team, you will have ample opportunity to lead and scale the product and the team.

Responsibilities

  • Work with product management to understand customers' case data workflows to translate their pain points to requirements.
  • Gain a solid understanding of customer backend pipelines and integration surfaces.
  • Break down large projects into potentially shippable incremental units within a timeframe.
  • Understand the current system architecture and design subsystems.
  • Be hands-on and build end-to-end new features and support existing features.
  • Engage in all stages of the product development life cycle.
  • Adopt a customer-centric and iterative mindset.
  • Promote engineering best practices such as API design standards, coding standards, code test coverage, code reviews, CI/CD, and documentation.
  • Work with lead engineers and mentor junior engineers to deliver on large initiatives.
  • Our tech stack is varied - you will start with Ruby/Rails, JavaScript/React, Postgres, Redis, Sidekiq, Heroku, AWS, and more.

Required Skills: 

  • Exposure to a variety of stacks building SaaS platforms and/or on-premise solutions.
  • Experience with building sub-systems or services in a distributed system.
  • Exposure to one or more backend architectures such as event-driven, microservices, and cloud architectures.
  • Experience troubleshooting performance and/or scale issues.
  • Strong in RDBMS and/or NoSQL databases.
  • Proficient in one or more scripting languages - experience in Ruby/Rails and JavaScript/React is a plus.
  • Strong in data structures, algorithms, object-oriented programming, and design patterns.
  • Ability to effectively communicate and collaborate with both technical and nontechnical team members.
  • Prior experience in the legal tech industry is a plus.

Years of Experience: 5+ years

Educational Qualification: BE, BCA, MCA, or MSc IT
