Job Title: L2 SIEM Administrator - LogRhythm
Location:
Pune – Customer Site (Magarpatta)
Job Summary:
We are seeking an experienced and proactive L2 SIEM Administrator with expertise in LogRhythm to manage, maintain, and optimize our Security Information and Event Management (SIEM) infrastructure.
The ideal candidate will develop use case frameworks, implement SIEM rules, and ensure efficient log management and threat detection.
Key Responsibilities:
LogRhythm Administration:
Manage and maintain the LogRhythm SIEM platform for optimal performance.
Develop, implement, and fine-tune use case frameworks and detection rules to enhance threat detection.
Incident Analysis:
Investigate security alerts and logs to identify and respond to threats.
Escalate unresolved issues to higher-level teams or external stakeholders.
Log Management:
Onboard and configure log sources, ensuring accurate data ingestion and normalization.
Validate log integrity across network and endpoint sources.
Optimization and Troubleshooting:
Resolve technical issues and optimize system performance.
Monitor and maintain dashboards and reporting tools for actionable insights.
Qualifications:
Proven expertise with LogRhythm, including creating and managing use case frameworks and detection rules.
3+ years of experience in SIEM administration.
Strong understanding of security logs, event correlation, and incident analysis.
Familiarity with scripting (Python, PowerShell) and security frameworks (e.g., MITRE ATT&CK).
Relevant certifications (e.g., LogRhythm Certified Professional (LRCP)) are a plus.
Job Description:
Job Title: Data Engineer
Location: Pune (Hybrid Work Model)
Experience Required: 4 to 8 Years
Role Overview :
We are seeking talented and driven Data Engineers to join our team in Pune. The ideal candidate will have a strong background in data engineering with expertise in Python, PySpark, and SQL. You will be responsible for designing, building, and maintaining scalable data pipelines and systems that empower our business intelligence and analytics initiatives.
Key Responsibilities:
- Develop, optimize, and maintain ETL pipelines and data workflows.
- Design and implement scalable data solutions using Python, PySpark, and SQL.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Work on hybrid data environments involving on-premise and cloud-based systems.
- Assist in the deployment and maintenance of big data solutions.
Required Skills and Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or related field.
- 4 to 8 Years of experience in Data Engineering or related roles.
- Proficiency in Python and PySpark for data processing and analysis.
- Strong SQL skills with experience in writing complex queries and optimizing performance.
- Familiarity with data pipeline tools and frameworks.
- Knowledge of cloud platforms such as AWS, Azure, or GCP is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.
Preferred Qualifications:
- Experience with big data technologies like Hadoop, Hive, or Spark.
- Familiarity with data visualization tools and techniques.
- Knowledge of CI/CD pipelines and DevOps practices in a data engineering context.
Work Model:
- This position follows a hybrid work model, with candidates expected to work from the Pune office as per business needs.
Why Join Us?
- Opportunity to work with cutting-edge technologies.
- Collaborative and innovative work environment.
- Competitive compensation and benefits.
- Clear career progression and growth opportunities.
- C#
- ASP.NET
- SOLID principles
- DDD
- Unit/integration testing
- React / TypeScript
- SQL
- Agile SCRUM
- Oracle, SQL Server, Snowflake
Additional expectations:
- AWS
- Docker
- Terraform
- Coding with Copilot
- GMP or other strict compliance projects
- Code reviewing
- Pair programming
- Interest in Functional Programming
Issuing domestic and international flight tickets
Coordinating with the airlines
Preferred knowledge of IATA, GDS, Amadeus, Abacus, Sabre
Headquarters - Bangalore - 1st Floor, Lakshmi Chambers, HSR Layout, Sector 6
HR POC - Mounika Purama
Work Mode - Work From Office
6 working days (Sat & Sun - field work) (Monday - week off)
Interview Mode - F2F
Interview Timings - 10:30 AM to 6:00 PM (Tuesday to Sunday)
Available Locations - North Bangalore, South East Bangalore, Hyderabad, Pune, Bhubaneswar, Mumbai
Salary - Negotiable
Our client is a B2B SaaS product company in the HR technology space, helping organisations make informed decisions in areas such as hiring, training, and career succession. Founded in 2010, the company has since become a market leader in the HR technology space. The founders are alumni of Stanford University, and their employees have experience working with PwC, McKinsey, and other similar organisations. With the founders' bright vision, the organisation is in expansion mode to capture niche markets and become a global leader in this domain.
- Experience in back-end development using Ruby on Rails or NodeJS
- Experience working with at least two of MongoDB / Postgres / MySQL & Redis
- Experience with MVC patterns using frameworks like Rails and ExpressJS
- Strong understanding of RESTful APIs and the HTTP protocol
- Understanding of application security, with the ability to implement OWASP-compliant systems
- Strong understanding of Linux OS, file systems, firewalls, etc.
- 3+ years of experience in Ruby on Rails
- Minimum 3 years with MongoDB / PostgreSQL
- Must be from product-based companies
- Experience supporting medium/large-scale support and implementation projects on SAP ECC on HANA.
- Worked with SAP MM, SD, FICO, and HCM. Experience in S/4HANA implementation.
- Thorough understanding of mapping technical designs to Functional Specs (FS) and creating/reviewing the corresponding Technical Specs (TS).
- Detailed and exhaustive understanding of ABAP coding practices and naming conventions.
- Delivery of new Business Requests / Change Requests with good quality, within the defined timeline and budget. Data Dictionary concepts (mandatory). Report development: classical / interactive / ALV; SAP Script / Smart Forms / Adobe Forms; Module Pool; SAP BAPI/RFC; performance optimizations.
- Proficient in finding the correct User Exits / SAP Enhancement Points and SAP BADIs. IDoc configuration, extensions, and custom IDocs (mandatory as working knowledge, if not complete hands-on).