About the Role:
We are looking for an experienced Senior Network Engineer with a strong background in Ansible to join our team. The ideal candidate will be involved in the implementation and management of advanced network infrastructures, with a focus on automation and optimization.
Key Responsibilities:
· Manage network infrastructure in a 24/7 support environment and maintain 99% network uptime
· Manage and troubleshoot LAN, VLAN, web filter, IP routing, LACP, wireless, and WAN networks
· Hands-on experience with Palo Alto firewalls, Cisco switches, HP ProCurve, and Aruba wireless
· Good to have: knowledge of F5 load balancers, ClearPass, and Aruba Wireless IAP/controller-based RADIUS integration
· Develop, test, and maintain Ansible playbooks for network automation.
· Monitor network performance and implement improvements.
· Troubleshoot and resolve advanced network issues.
· Collaborate with cross-functional teams to ensure network reliability and security.
· Mentor junior network engineers and provide technical guidance.
· Maintain comprehensive network documentation and ensure compliance with industry standards.
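The Ansible automation work described above might look like the following minimal playbook sketch. This is an illustrative assumption, not part of the posting: the inventory group `access_switches`, the VLAN ID, and the VLAN name are all invented for the example.

```yaml
# Minimal illustrative playbook: ensure a VLAN exists on Cisco IOS switches.
# Group name "access_switches" and the VLAN details are hypothetical.
- name: Provision user VLAN on access switches
  hosts: access_switches
  gather_facts: false
  tasks:
    - name: Ensure VLAN 110 exists
      cisco.ios.ios_vlans:
        config:
          - vlan_id: 110
            name: USER_DATA
        state: merged
```

In practice a playbook like this would run against an inventory of managed switches, with `state: merged` keeping the change idempotent across repeated runs.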
Qualifications:
· Bachelor’s degree in Computer Science, Information Technology, or a related field.
· Extensive experience as a Network Engineer, with a focus on senior-level responsibilities.
· Knowledge of Ansible for network automation.
· Deep understanding of network protocols (e.g., TCP/IP, BGP, OSPF).
· Experience with network hardware (e.g., routers, switches, firewalls).
· Proficiency in Cisco Meraki configuration and management.
· Expert-level knowledge of Palo Alto firewalls.
· Strong problem-solving and communication skills.
· Relevant certifications (e.g., CCNA, CCNP, PCNSA) are highly desirable.
· Experience with other automation tools (e.g., Terraform, Puppet).
Preferred Skills:
· Experience with cloud networking (e.g., AWS, Azure).
· Knowledge of network security best practices.
· Knowledge of scripting languages (e.g., Python, Bash).
· Experience with network monitoring tools (e.g., Nagios, SolarWinds).
· Understanding of SD-WAN (Software-Defined Wide Area Networking) technologies.
· Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
· Experience with VPN and remote access solutions.
· Familiarity with ITIL processes and frameworks.

About LogixHealth
About us
Arka Energy is focused on changing the paradigm on energy, creating innovative renewable energy solutions for residential customers. With its custom product design and an innovative approach to marketing the product solution, Arka aims to be a leading provider of energy solutions in the residential solar segment. Arka designs and develops end-to-end renewable energy solutions with teams in Bangalore and the Bay Area.
The product is 3D simulation software that replicates rooftops and commercial sites, places solar panels, and estimates solar energy generation.
What are we looking for?
· As a backend developer, you will be responsible for developing solutions that make Arka's products easy for customers to adopt.
· Attention to detail and willingness to learn are a big part of this position.
· Commitment to problem solving and innovative design approaches is important.
Role and responsibilities
● Develop cloud-based Python Django software products
● Working closely with UX and Front-end Developers
● Participating in architectural, design, and product discussions
● Designing and creating RESTful APIs for internal and partner consumption
● Working in an agile environment with an excellent team of engineers
● Own and maintain code, from development through fixing bugs and issues.
● Deliver clean, reusable high-quality code
● Facilitate problem diagnosis and resolution for issues reported by Customers
● Deliver to schedule and timelines based on an Agile/Scrum-based approach
● Develop new features and ideas to make the product better and more user-centric.
● Must be able to independently write code and test major features, as well as work jointly with other team members to deliver complex changes
● Create algorithms from scratch and implement them in the software.
● Code review and end-to-end unit testing.
● Guiding and monitoring junior engineers.
SKILL REQUIREMENTS
● Solid database skills in a relational database (e.g., PostgreSQL, MySQL)
● Knowledge of how to build and use RESTful APIs
● Strong knowledge of version control (e.g., Git, SVN)
● Experience deploying Python applications into production
● Azure or Google cloud infrastructure knowledge is a plus
● Strong drive and ability to learn new technologies quickly
● Continuous look-out for new and creative solutions to implement new features or improve old ones
● Data Structures, Algorithms, Django and Python
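As a rough illustration of the relational-database and RESTful-API skills listed above, here is a minimal Python sketch. It is not Arka's actual code: `sqlite3` stands in for PostgreSQL/MySQL, and the `panels` table and JSON response shape are invented for the example; a real implementation would use Django's ORM and views.

```python
import json
import sqlite3


def create_store(conn: sqlite3.Connection) -> None:
    """Create a tiny hypothetical 'panels' table and seed two rows."""
    conn.execute(
        "CREATE TABLE panels (id INTEGER PRIMARY KEY, site TEXT, watts INTEGER)"
    )
    conn.executemany(
        "INSERT INTO panels (site, watts) VALUES (?, ?)",
        [("rooftop-a", 450), ("rooftop-b", 500)],
    )


def list_panels(conn: sqlite3.Connection) -> str:
    """Serialize all rows as the JSON body a GET /panels endpoint might return."""
    rows = conn.execute("SELECT id, site, watts FROM panels ORDER BY id").fetchall()
    payload = [{"id": r[0], "site": r[1], "watts": r[2]} for r in rows]
    return json.dumps({"count": len(payload), "results": payload})


conn = sqlite3.connect(":memory:")
create_store(conn)
body = list_panels(conn)
```

The parameterized `executemany` call shows the kind of SQL hygiene expected of backend work, and the serializer keeps the API response shape in one place.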
Good To have
· Knowledge of GenAI applications.
Key Benefits
· Competitive development environment
· Engagement into full scale systems development
· Competitive Salary
· Flexible working environment
· Equity in an early-stage start-up
· Patent Filing Bonuses
· Health Insurance for Employee + Family
Job Title: Big Data Engineer (Java Spark Developer – JAVA SPARK EXP IS MUST)
Location: Chennai, Hyderabad, Pune, Bangalore (Bengaluru) / NCR Delhi
Client: Premium Tier 1 Company
Payroll: Direct Client
Employment Type: Full time / Perm
Experience: 7+ years
Job Description:
We are looking for skilled Big Data Engineers using Java Spark, with 7+ years of experience in Big Data / legacy platforms, who can join immediately. The desired candidate has experience designing, developing, and optimizing real-time and batch data pipelines in enterprise-scale Big Data environments. You will build scalable, high-performance data processing solutions, integrate real-time data streams, and build reliable data platforms. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.
Key Responsibilities:
· Develop data pipelines using Java Spark and Kafka.
· Optimize and maintain real-time data pipelines and messaging systems.
· Collaborate with cross-functional teams to deliver scalable data solutions.
· Troubleshoot and resolve issues in Java Spark and Kafka applications.
Qualifications:
· Experience in Java Spark is a must
· Knowledge and hands-on experience using distributed computing, real-time data streaming, and big data technologies
· Strong problem-solving and performance optimization skills
· Looking for immediate joiners
If interested, please share your resume along with the following details
1) Notice Period
2) Current CTC
3) Expected CTC
4) Experience in Java Spark - Y / N (this is a must)
5) Any offers in hand
Thanks & Regards,
LION & ELEPHANTS CONSULTANCY PVT LTD TEAM
SINGAPORE | INDIA
About Company
Buyume is an e-commerce platform for beauty professionals, where they can buy beauty products as well as educate themselves about the salon and beauty industry.
Requirements
This candidate will be responsible for prospecting, qualifying, and generating new sales opportunities. To be successful in this role and meet or exceed quota, the candidate should feel comfortable communicating via phone and email with prospects discovered through a variety of avenues.
We are looking for a candidate with any of the given below regional language proficiency
(Odia, Gujarati, Assamese, Bengali, Marathi, Malayalam, Telugu, Kannada, Tamil).
Responsibilities
- Research, target and open new client opportunities
- Develop targeted messaging to engage prospect companies and executives
- Qualify prospects by understanding customer needs and budgets
Qualifications
- 10+2
- 2+ years' previous sales experience
- Extrovert
Job Title: AWS-Azure Data Engineer with Snowflake
Location: Bangalore, India
Experience: 4+ years
Budget: 15 to 20 LPA
Notice Period: Immediate joiners or less than 15 days
Job Description:
We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.
Responsibilities:
- Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
- Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
- Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning.
- Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
- Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
- Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
- Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
- Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
- Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
- Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
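The validation, cleansing, and transformation responsibilities above can be sketched in plain Python. The field names and rules here are invented for illustration; in practice this logic would live inside an AWS Glue, Azure Data Factory, or Snowflake pipeline rather than a standalone script.

```python
# Minimal illustrative record-cleansing step for an ETL pipeline.
# Field names ("customer_id", "amount") and rules are hypothetical.
def clean_records(records):
    """Drop records that fail validation; normalize the rest."""
    cleaned = []
    for rec in records:
        # Validate: both fields must be present and non-empty.
        if not rec.get("customer_id") or rec.get("amount") in (None, ""):
            continue
        # Validate: the amount must parse as a number.
        try:
            amount = round(float(rec["amount"]), 2)
        except (TypeError, ValueError):
            continue
        # Transform: trim whitespace and standardize casing.
        cleaned.append({
            "customer_id": str(rec["customer_id"]).strip().upper(),
            "amount": amount,
        })
    return cleaned


raw = [
    {"customer_id": " c001 ", "amount": "19.990"},
    {"customer_id": "", "amount": "5.00"},      # rejected: missing id
    {"customer_id": "c002", "amount": "oops"},  # rejected: bad amount
]
result = clean_records(raw)
```

Keeping validation and transformation in one pure function like this makes the step easy to unit test before it is wired into a managed pipeline.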
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
- Strong proficiency in data modelling, ETL development, and data integration.
- Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
- In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
- Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
- Familiarity with data governance principles and security best practices.
- Strong problem-solving skills and ability to work independently in a fast-paced environment.
- Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
- Immediate joiner or notice period less than 15 days preferred.
If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Java with cloud
- Core Java, Spring Boot, Microservices
- DB2 or any RDBMS database application development
- Linux OS, shell scripting, batch processing
- Troubleshooting large-scale applications
- Experience in automation and unit test frameworks is a must
- AWS Cloud experience desirable
- Agile development experience
- Complete development cycle (Dev, QA, UAT, Staging)
- Good oral and written communication skills
Greetings from Zealous Services!!
We have immediate requirements for freshers in a BPO Semi Voice Process @ Chennai
Interview Scheduled:
- Walk-in: Monday to Friday (10 AM to 4 PM)
Contact Details:
- Reference Name: PRAVALIKA- HR
Job Description:
- Process: International Semi Voice Process
- Designation: Customer Support Executive
- Qualification: Any graduate (Arts & Science, Engineering) / Diploma / ITI / 12th / Distance Education / Dropout (2017 - 2021 pass-outs only)
- Shift Timings: Night Shift (6:30 PM to 4:30 AM)
- Salary: 8,500 + incentives up to 3,000 (based on your performance) + food & refreshment
- Age Limit: 18 - 26 years
- Work Location: Nungambakkam, Chennai
Do mention "PRAVALIKA - HR" at the top right of your resume when coming for the walk-in. Also bring a photocopy of your Aadhaar card, along with educational documents for verification.
Note: Interested candidates can kindly walk in to the below mentioned venue.
Venue:
Zealous Services Pvt. Ltd.,
No: 17/7, A Block, Ground Floor,
NRCS Towers, Kodambakkam High Road,
Nungambakkam, Chennai - 600034
Landmark: Near Palmgrove Hotel and News7 Channel
Refer your friends too!!! Spot Offer!!! Don't Miss It!!!
Work Location: Hyderabad
Experience: 4 to 6 years
Package: Up to 10 LPA
Notice Period: Immediate joiners
It's a full-time opportunity with our client.
Mandatory Skills: Node.js, Angular & MEAN stack
Responsibilities:
--Strong hands-on experience with the latest versions of AngularJS, Angular, Node.js, microservices, and MongoDB.
--Experience in designing and architecting portal solutions
--Strong hold on JavaScript
--Hands-on experience in creating microservices using Node.js and Express.js.
--Perform third-party API integration with Node.js.
--Hands-on experience in HTML5, CSS3, and JS libraries like Bootstrap, jQuery, and Angular Material.
--Exposure to working in an agile development environment.
--Strong analytical and problem-solving skills
--Knowledge of SVN/Git repositories
--Good communication and client-facing experience.
We have openings for MERN Stack Developer.
Job Description: We are looking for developers with a minimum of 4 years of experience as a MERN Stack Developer.
Locations : Bangalore/Mumbai/Hyderabad/Kolkata/Delhi
Skills :
- Should have work experience as a MERN Stack Developer
- Experience with NodeJS, ExpressJS or Restify, and ReactJS.
- Experience creating front end applications using HTML, CSS.
- Hands-on experience with JavaScript Development on both client and server-side
- Experience with modern frameworks and design patterns, minimum one-year experience with MERN Full-stack paradigm
- Knowledge of the following will be considered as an advantage :
1. Creating secure RESTful web services in XML and JSON, JavaScript, jQuery
2. Continuous integration (Jenkins/Hudson) and version control (SVN, Git)
- Onsite opportunity is also available.
