Network Lead
The Network Lead designs, plans and develops enterprise network device configurations, endpoint connectivity and network security, participating in service flow discussions and deployment architecture. The role establishes monitoring and maintenance procedures and provides support and optimization of all network hardware, software, and communication links. This position will be working on an engineering and project team providing solutions and new implementations.
Responsibilities:
- Design cost-effective data communications solutions to meet new and evolving requirements.
- Plan and execute the installation of new networking components to fulfil business requirements.
- Support the design and analysis of network improvements and enhancements; provide analysis to support performance tuning and capacity planning.
- Provide troubleshooting and problem-solving support for network issues.
- Monitor system hardware and software and manage system performance.
- Determine capacity and performance planning goals.
- Maintain documentation of network configuration and records relating to network hardware and software.
- Manage projects directly related to the area of assigned responsibility and other general network projects.
- Serve as technical consultant on new requests from clients; provide technical expertise to other network engineers and IT department team members.
- Efficiently manage work across multiple networks and related systems.
Preferred Skills:
- Industry certification: CCNP or CCIE.
- Expert knowledge of network systems and emerging technologies; experience on projects around Cloud and SaaS platforms.
- Advancing knowledge in other areas of infrastructure design and management.
- Advanced knowledge of support and administration of network management tools.
- Experience supporting MDM, Firewalls, Load Balancers, Riverbed WAN Optimizers & Infoblox technology.
Desired Skills:
- CCNP/CCIE with a strong routing & switching background (8+ years of experience)
- Strong understanding of L2/L3 switching and MPLS (WAN) technologies
- Experience with firewall & load balancer technologies
- Experience with Forescout NAC & Infoblox
- Experience with network monitoring tools (PRTG)
- Experience with optimization appliances
- Experience with scripting and automation
- Product knowledge of Citrix and Check Point
- Exposure to SSL, direct TLS and mutual TLS
Similar jobs
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components, covering ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, Resource Manager, Node Manager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on the cluster for computations (analytics); installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different ingestion and aggregation events and loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in Microsoft cloud and setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on the cluster for computations (analytics); installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, and used Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python (see the illustrative sketch after this list).
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working on MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
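Several of the bullets above describe the same PySpark pattern: read raw data into DataFrames, transform it with the DataFrame API or Spark SQL, and write the result out for Hive/Tableau consumption. The following is a minimal illustrative sketch only; the input path, column names and table name are hypothetical and are not taken from any project described above.

    # Illustrative PySpark sketch only; paths, columns and table names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("example-etl")
             .enableHiveSupport()   # assumes a Hive metastore is available
             .getOrCreate())

    # Read raw event data into a DataFrame (hypothetical HDFS path and schema).
    events = spark.read.json("hdfs:///data/raw/consumer_events/")

    # DataFrame API transformation: aggregate consumer responses per day.
    daily = (events
             .withColumn("event_date", F.to_date("event_ts"))
             .groupBy("event_date", "response_type")
             .agg(F.count("*").alias("responses")))

    # Equivalent Spark SQL: register the DataFrame as a temporary view and query it.
    events.createOrReplaceTempView("events")
    daily_sql = spark.sql("""
        SELECT to_date(event_ts) AS event_date, response_type, COUNT(*) AS responses
        FROM events
        GROUP BY to_date(event_ts), response_type
    """)

    # Persist for downstream dashboards (assumes the 'analytics' database exists).
    daily.write.mode("overwrite").saveAsTable("analytics.daily_consumer_responses")

The DataFrame API and the Spark SQL query above produce the same result; which form is used in practice is largely a readability choice.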
Data Science:
• Expert-level Python, analytical skills, work with different models, basic concepts, CPG (domain).
• Statistical Models & Hypothesis Testing
• Machine Learning (important)
• Business understanding and visualization in Python.
• Classification, clustering and regression
Mandatory Skills
• Data Science, Python, Machine Learning, Statistical Models, Classification, clustering and regression (a brief illustrative sketch follows)
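The mandatory skills above name classification, clustering and regression in Python. Below is a minimal illustrative sketch, assuming scikit-learn and a synthetic dataset; the data, features and model choices are assumptions for illustration only and are not part of the role description.

    # Minimal, illustrative sketch: synthetic data and default scikit-learn models only.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression, LinearRegression
    from sklearn.cluster import KMeans
    from sklearn.metrics import accuracy_score, r2_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 4))                    # hypothetical feature matrix
    y_class = (X[:, 0] + X[:, 1] > 0).astype(int)    # hypothetical binary target
    y_reg = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=500)

    # Classification
    X_tr, X_te, y_tr, y_te = train_test_split(X, y_class, test_size=0.2, random_state=42)
    clf = LogisticRegression().fit(X_tr, y_tr)
    print("classification accuracy:", accuracy_score(y_te, clf.predict(X_te)))

    # Clustering (unsupervised: no target used)
    cluster_labels = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X)

    # Regression
    X_tr, X_te, y_tr, y_te = train_test_split(X, y_reg, test_size=0.2, random_state=42)
    reg = LinearRegression().fit(X_tr, y_tr)
    print("regression R^2:", r2_score(y_te, reg.predict(X_te)))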
Key Responsibilities:
● Design, develop, and maintain applications using .NET Core, ensuring optimal performance and scalability.
● Write clean, maintainable, and efficient code, adhering to coding standards and best practices.
● Diagnose and resolve technical issues, optimizing applications for maximum speed and scalability.
Technical Skills:
● Proficiency in .NET Core, C#, Web API, EF, LINQ and related technologies.
● Experience in developing applications using microservice architecture
● Experience with cloud platforms
● Familiarity with front-end technologies (e.g., Angular, React) is a plus.
● Knowledge of version control systems like Git.
● Demonstrated proficiency in database design and the ability to write complex queries
● Proven experience in leading development teams and driving technical initiatives.
Designation: Business Development Executive
Experience: 0 to 10 years
Salary: As per Company standards.
Joining: Immediately
Job Location: Pune.
Job description:
- Explain admission procedures and the courses offered to students and their parents through phone calls.
- Counsel students and parents about our courses.
- Interact with students and parents on a regular basis.
- Monitor class schedules to ensure smooth running of classes.
- Support the students throughout the admission process by answering their queries.
- Maintain target metrics by converting potential students into confirmed admissions and succeed in achieving performance goals.
- Maintain regular communication with students and parents to coordinate admission activities and resolve problems.
- Result-oriented and able to work under pressure to achieve targets.
- Excellent command of spoken and written English as well as the local language. Strong presentation and persuasion skills.
- Arranging seminars and workshops.
- Report daily, weekly and monthly status.
- Should be target oriented and enthusiastic towards the role.
- Participating in educational fairs and events.
- An ability to communicate effectively with colleagues, students and other members of the public of all age groups.
- Ability to learn on own initiative and research best study options for students.
- Flexibility over working hours
- Must have a pleasing personality, excellent communication skills, and be soft-spoken.
Headquartered in Redwood city, CA, our client is a communication and assessment SaaS startup that enables the movement health professional (athletic trainer, physical therapist, recovery specialist) to seamlessly capture and assess patient care data through cutting edge technologies such as machine learning and AI. It improves outcomes for healthcare organisations by identifying early signs of clinical deterioration in chronically-ill patients thereby decreasing hospital admissions and reducing unnecessary spending.
This revolutionary startup has raised $3 Mn in a seed funding round led by top investors. It is all set to democratise the accessibility and affordability of movement health through its full stack digital health platform.
What you will do:
- Handling multiple projects end to end across multiple domains with cross-functional teams (Engineering, Quality, Design, etc)
- Analyzing business requirements, preparing project plans, ensuring the smooth delivery of projects, and handling day-to-day project management activities
- Participating in client calls to discuss the nature, urgency and root cause of the problems and appropriately taking decisions in collaboration with clients
- Identifying project roadblocks, risks and mitigation plans; keeping track of tasks and defects assigned to the team
- Working closely with technical development teams on estimation, technical assessments and setting up backlogs/project plan
- Hosting internal and external customer demos at various points along with managing project status on Project Management tools and collaboration tools
- Identifying scope change and leading efforts to develop, estimate, and communicate budget and schedule impacts
- Working with management on resource allocation, feedback, and process improvements
- Driving prompt resolution of customer issues and ensuring high levels of customer satisfaction
- Selecting appropriate project management approaches (Agile and/or Waterfall) based on customer and project requirements
Desired Candidate Profile
What you need to have:
- Bachelor of Technology in Computer Science/Engineering/MCA
- 4+ years of experience on Project/Program Management roles working on managing complete development cycle of SaaS products/applications
- Certification in Scrum Master
- Knowledge of Project Management tools such as JIRA
- Software Development or QA Engineer background is a plus
- Ability to transform raw thoughts into clear technical documentation and/or direction
- Problem-solving capabilities with the ability to identify root cause, patience and commitment to solve
- Excellent verbal and written communication, presentation and articulation skills
Job Description - Data Engineer
About us
Propellor is aimed at bringing Marketing Analytics and other Business Workflows to the Cloud ecosystem. We work with International Clients to make their Analytics ambitions come true, by deploying the latest tech stack and data science and engineering methods, making their business data insightful and actionable.
What is the role?
This team is responsible for building a Data Platform for many different units. This platform will be built on the cloud, and therefore in this role the individual will be organizing and orchestrating different data sources and giving recommendations on the services that fulfil goals based on the type of data.
Qualifications:
• Experience with Python, SQL, Spark
• Knowledge/notions of JavaScript
• Knowledge of data processing, data modeling, and algorithms
• Strong in data, software, and system design patterns and architecture
• API building and maintaining
• Strong soft skills, communication
Nice to have:
• Experience with cloud: Google Cloud Platform, AWS, Azure
• Knowledge of Google Analytics 360 and/or GA4.
Key Responsibilities
• Design and develop platform based on microservices architecture.
• Work on the core backend and ensure it meets the performance benchmarks.
• Work on the front end with ReactJS.
• Designing and developing APIs for the front end to consume (a minimal illustrative sketch follows this list).
• Constantly improve the architecture of the application by clearing the technical backlog.
• Meeting both technical and consumer needs.
• Staying abreast of developments in web applications and programming languages.
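One of the responsibilities above is designing and developing APIs for the front end to consume. The sketch below is a minimal illustration only; FastAPI and the /projects resource are assumptions chosen for brevity, as the posting does not name a specific backend framework or data model.

    # Illustrative sketch of a backend API endpoint for a front end to consume.
    # FastAPI and the Project resource are assumptions, not the team's actual stack.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI()

    class Project(BaseModel):
        id: int
        name: str

    # In-memory store standing in for a real database.
    PROJECTS = {1: Project(id=1, name="Marketing Analytics Platform")}

    @app.get("/projects/{project_id}", response_model=Project)
    def get_project(project_id: int) -> Project:
        project = PROJECTS.get(project_id)
        if project is None:
            raise HTTPException(status_code=404, detail="Project not found")
        return project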
What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply even if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.
• Education - BE/MCA or equivalent.
• Agnostic/Polyglot with multiple tech stacks.
• Worked on open-source technologies – NodeJS, ReactJS, MySQL, NoSQL, MongoDB, DynamoDB.
• Good experience with Front-end technologies like ReactJS.
• Backend exposure – good knowledge of building API.
• Worked on serverless technologies.
• Efficient in building microservices that combine the server and front end.
• Knowledge of cloud architecture.
• Should have sound working experience with relational and columnar databases.
• Should be innovative and communicative in approach.
• Will be responsible for the functional/technical track of a project.
Whom will you work with?
You will closely work with the engineering team and support the Product Team.
Hiring Process includes:
a. Written Test on Python and SQL
b. 2 - 3 rounds of Interviews
Immediate Joiners will be preferred
Experience: 5-9 years
Skills: Java, Spring MVC, Spring Boot, HTML, CSS, JavaScript, Angular.js, RESTful APIs
- Looking for Senior Java Developers with strong core Java skills and event-driven microservices development experience.
- Experience in the banking domain is an added advantage.
Responsibilities:
1. Experience working in Lean and agile environments and practising Scrum
2. Designing of robust and scalable solutions to support Enterprise application
3. Develop and support middleware applications using Java / Open Source
4. Provide second level support to fixes and solutions for issues in production
5. Enhance and maintain applications and components for Liquidity functions.
6. Experience in Designing scalable solutions
7. Engineering architecture, platform configuration, and documentation experience.
8. Knowledge of modern development lifecycles, such as Agile and iterative development
9. Knowledge of Big Data with HBase and Spark is an added advantage
10. Collaborate with BA, QA and PM team members for effective testing and delivery.
Requirements :
1. 5+ years of experience in core Java
2. Experience in Spring MVC, Spring Boot
3. Strong database and data management experience in relational databases
4. Excellent verbal and written communication skills in English
Desired Qualifications:
5. Knowledge of HTML, CSS, JavaScript, Angular.js
6. Experience in event-driven workflow processing using Kafka, ActiveMQ, or Netflix OSS or similar
7. Experience in the banking domain is an added advantage
- Meeting with the development team to discuss user interface ideas and applications.
- Reviewing application requirements and interface designs.
- Identifying web-based user interactions.
- Developing and implementing highly responsive user interface components using React concepts.
- Writing application interface code using JavaScript following React.js workflows.
- Troubleshooting interface software and debugging application codes.
ABOUT THE COMPANY
Korcomptenz, a Microsoft Gold Certified Partner, is invested in Microsoft's mission of empowering every client to achieve more. Our "Purpose" is to help our clients embark on and accelerate their Digital Transformation journey. Our core principles are anchored in a growth mindset, customer-centricity, and a deep commitment to our team.
At Korcomptenz, Marketing is an Art and a Science. With a rich pedigree, our branding is evolving as we continuously work on "our key focus" through research, focus groups, and interviews inside and outside the company. Our marketing team is busy creating meaningful interactions in every conversation we have with our audience and customers and as a company. We have an obligation to create the right customer experiences and carry our leadership's mission of what we stand for. This means that we constantly huddle to think about the "Why" and "How" of what we do at work every day. We are continuously having creative and tactical conversations on how to "value add" and continue to upgrade our skills for modern marketing. The destination is important, but so is the journey… Join us to experience the journey and let's get to the destination together!
Why you should join our KOR team?
- Korcomptenz offers a culture that is Holistic, Transformational, Entrepreneurial and Family-like
- We put "People First" – whether it is our clients or our team, we strive to work on and for "What Matters"
- Compelling opportunities to be creative and leverage your critical thinking skills in addition to technology expertise
- State-of-the-art infrastructure, transparency, constant training for all team members
- Flexible locations and work-from-home options based on roles
- Opportunities to work abroad and onsite
Paid Social Specialist (Excellent in LinkedIn and Google AdWords)
Roles and Responsibilities
- Responsible for Performance Marketing and Lead Generation through PPC Program
- Provide insights into the best strategies and tactics to power up SEO strategy on traffic and keyword ranking
- Plan, execute, manage and optimize branding, new acquisition, and account based campaigns on LinkedIn and Google AdWords.
- Maximize ROI on ad spend. Plan and execute landing pages, lead gen forms and ad copy; monitor campaign performance
- Expert on Display, Search, Video, Apps Ad Campaigns and Performance Reporting
- Experience in B2B and account-based marketing for the US market is a must-have
- Experience and working knowledge of CRMs like HubSpot and MS CRM.
- Knowledge of SEO and marketing concepts is important; technology selling experience, especially ERP/CRM, is a bonus.
- Excellent communication and editorial writing skills and should be able to work independently
Qualifications
- Minimum 6-12 months of experience running LinkedIn and Google Ads campaigns