50+ Linux/Unix Jobs in Pune | Linux/Unix Job openings in Pune
Apply to 50+ Linux/Unix Jobs in Pune on CutShort.io. Explore the latest Linux/Unix Job opportunities across top companies like Google, Amazon & Adobe.
Job Description:
- 3+ years of experience in functional testing with a good technical foundation
- Experience in the Capital Markets/Investment Banking domain is a must
- Exposure to API testing tools like SoapUI and Postman (see the illustrative sketch after this list)
- Well versed in SQL
- Hands-on experience debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
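As a rough illustration of the kind of check these API tools automate, here is a minimal sketch in Python using the requests library; the endpoint, fields, and expected values are hypothetical and not taken from this posting.

```python
# Minimal API check sketch: call a REST endpoint and validate the JSON
# structure, similar to what a Postman/SoapUI test would assert.
# The URL and field names below are hypothetical.
import requests

def check_trade_lookup():
    resp = requests.get(
        "https://api.example.com/v1/trades/12345",   # hypothetical endpoint
        headers={"Accept": "application/json"},
        timeout=10,
    )
    assert resp.status_code == 200, f"unexpected status {resp.status_code}"
    body = resp.json()
    assert body["tradeId"] == "12345"                # basic JSON field checks
    assert body["status"] in {"BOOKED", "SETTLED"}

if __name__ == "__main__":
    check_trade_lookup()
    print("trade lookup check passed")
```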
Location:
Pune/Mumbai
About Wissen Technology:
· The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.
· Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products.
· Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
· Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
· Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
· We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
· Wissen Technology has been certified as a Great Place to Work®.
· Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
· Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
· We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy. They include Morgan Stanley, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE, to name a few.
Website : www.wissen.com
About Lean Technologies
Lean is on a mission to revolutionize the fintech industry by providing developers with a universal API to access their customers' financial accounts across the Middle East. We’re breaking down infrastructure barriers and empowering the growth of the fintech industry. With Sequoia leading our $33 million Series A round, Lean is poised to expand its coverage across the region while continuing to deliver unparalleled value to developers and stakeholders.
Join us and be part of a journey to enable the next generation of financial innovation. We offer competitive salaries, private healthcare, flexible office hours, and meaningful equity stakes to ensure long-term alignment. At Lean, you'll work on solving complex problems, build a lasting legacy, and be part of a diverse, inclusive, and equal opportunity workplace.
About the role:
Are you a highly motivated and experienced software engineer looking to take your career to the next level? Our team at Lean is seeking a talented engineer to help us build the distributed systems that allow our engineering teams to deploy our platform in multiple geographies across various deployment solutions. You will work closely with functional heads across software, QA, and product teams to deliver scalable and customizable release pipelines.
Responsibilities
- Distributed systems architecture – understand and manage the most complex systems
- Continual reliability and performance optimization – enhancing observability stack to improve proactive detection and resolution of issues
- Employing cutting-edge methods and technologies, continually refining existing tools to enhance performance and drive advancements
- Problem-solving capabilities – troubleshooting complex issues and proactively reducing toil through automation
- Experience in technical leadership and setting technical direction for engineering projects
- Collaboration skills – working across teams to drive change and provide guidance
- Technical expertise – deep skills and the ability to act as a subject matter expert in one or more of: IaC, observability, coding, reliability, debugging, system design
- Capacity planning – effectively forecasting demand and reacting to changes
- Analyze and improve efficiency, scalability, and stability of various system resources
- Incident response – rapidly detecting and resolving critical incidents. Minimizing customer impact through effective collaboration, escalation (including periodic on-call shifts) and postmortems
Requirements
- 10+ years of experience in Systems Engineering, DevOps, or SRE roles running large-scale infrastructure, cloud, or web services
- Strong background in Linux/Unix Administration and networking concepts
- We work on OCI but would accept candidates with solid GCP/AWS or other cloud providers’ knowledge and experience
- 3+ years of experience managing Kubernetes clusters, Helm, and Docker (see the short sketch after this list)
- Experience in operating CI/CD pipelines that build and deliver services on the cloud and on-premise
- Experience with CI/CD tools/services such as Jenkins, GitHub Actions, ArgoCD, etc.
- Experience with configuration management tools such as Ansible, Chef, Puppet, or equivalent
- Infrastructure as Code - Terraform
- Experience in production environments with both relational and NoSQL databases
- Coding with one or more of the following: Java, Python, and/or Go
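As a small, hedged illustration of the kind of cluster housekeeping this role involves, the sketch below uses the official Kubernetes Python client to flag pods that are not running; the kubeconfig assumption and namespaces are illustrative only.

```python
# Minimal sketch: list pods that are not Running/Succeeded across namespaces,
# the sort of health sweep an SRE might script against a cluster.
# Assumes a local kubeconfig with access to the cluster.
from kubernetes import client, config

def list_unhealthy_pods():
    config.load_kube_config()          # inside a pod, use load_incluster_config()
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        if pod.status.phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")

if __name__ == "__main__":
    list_unhealthy_pods()
```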
Bonus
- Multi-cloud or hybrid cloud experience
- OCI and GCP
Why Join Us?
At Lean, we value talent, drive, and entrepreneurial spirit. We are constantly on the lookout for individuals who identify with our mission and values, even if they don’t meet every requirement. If you're passionate about solving hard problems and building a legacy, Lean is the right place for you. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, gender, sexual orientation, or disability.
Skill Set: Middleware application support:
Tomcat, Apache, Jenkins CI/CD pipelines, and monitoring of on-premise Linux servers (not cloud experience)
About the job
MangoApps builds enterprise products that make employees at organizations across the globe
more effective and productive in their day-to-day work. We seek tech pros, great
communicators, collaborators, and efficient team players for this role.
Job Description:
Experience: 5+ years (relevant experience as an SRE)
Open positions: 2
Job Responsibilities as an SRE
- Must have very strong experience in Linux (Ubuntu) administration
- Strong in network troubleshooting
- Experienced in handling and diagnosing the root cause of compute and database outages
- Strong experience required with cloud platforms, specifically Azure or GCP (proficiency in at least one is mandatory)
- Must have very strong experience in designing, implementing, and maintaining highly available and scalable systems
- Must have expertise in CloudWatch or similar log systems and troubleshooting using them
- Proficiency in scripting and programming languages such as Python, Go, or Bash is essential
- Familiarity with configuration management tools such as Ansible, Puppet, or Chef is required
- Must possess knowledge of database/SQL optimization and performance tuning.
- Respond promptly to and resolve incidents to minimize downtime
- Implement and manage infrastructure using IaC tools like Terraform, Ansible, or CloudFormation
- Excellent problem-solving skills with a proactive approach to identifying and resolving issues are essential.
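To give a flavour of the proactive checks described above, here is a minimal, hedged sketch in Python that flags filesystems nearing capacity and scans a log file for errors on a Linux host; the paths and thresholds are illustrative, not requirements of the role.

```python
# Minimal sketch of a proactive host check: warn on nearly-full filesystems
# and count error lines in a log. Mount points, log path, and threshold are
# illustrative.
import shutil

MOUNTS = ["/", "/var"]      # illustrative mount points
THRESHOLD = 0.85            # warn above 85% used

def check_disks():
    for mount in MOUNTS:
        usage = shutil.disk_usage(mount)
        used_ratio = usage.used / usage.total
        if used_ratio > THRESHOLD:
            print(f"WARNING: {mount} at {used_ratio:.0%} capacity")

def scan_log(path="/var/log/syslog", needle="ERROR"):
    try:
        with open(path, errors="ignore") as fh:
            hits = sum(1 for line in fh if needle in line)
        print(f"{hits} '{needle}' lines found in {path}")
    except FileNotFoundError:
        print(f"{path} not found on this host")

if __name__ == "__main__":
    check_disks()
    scan_log()
```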
Greetings!
Are you looking for an immediate job offer?
Job Title: SAP Consultant / Sr. Consultant - Basis
Location: Baner, Pune (WFO)
Experience: 6+ years
Must Have: Implementation and support project experience
***Only candidates who are immediately available or serving their notice period will be considered***
Key Responsibilities:
- Implement and support SAP Basis/ALM solutions
- Provide technical expertise in a customer-facing role
- Analyze complex requirements and provide system solutions
- Design, customize, configure, and test SAP systems
- Act as a liaison between client stakeholders and the technical team
- Perform performance monitoring, tuning, and problem resolution
- Assist in SAP upgrades, system refresh, and system copy activities
- Manage SAP parameter changes, operation modes, and logon groups
- Administer SAP databases (HANA, MSSQL, DB2, Oracle)
- Handle incident, problem, and change management processes
- Provide ad-hoc training and user support as needed
- Mentor junior team members and document processes and solutions
Required Knowledge/Skills:
- 6-8 years of experience as a Basis consultant handling ECC/S4HANA on Linux and Windows
- Experience with mid-sized SAP upgrades and system refresh/copy
- Knowledge of AS JAVA Stack and SAP Fiori
- HANA installation and administration experience
- Proficiency in add-on and support pack application and troubleshooting
- TMS administration, configuration, and troubleshooting
- SAP role-based authorization knowledge
- Spool administration and troubleshooting printer issues
- Ability to work in shifts and handle high-priority issues
- Strong understanding of incident, problem, and change management processes
Qualifications:
- Degree or similar qualification
- SAP certification preferred
Interested candidates, please send your resume and cover letter to [contact email] with the subject line "SAP Consultant / Sr. Consultant - Basis Application."
Regards,
Dimple Pal | Talent Acquisition - Executive
Linkedin - www.linkedin.com/in/dimple-pal-218b94147
Job Description:
· Proficient in Python.
· Good knowledge of stress/load testing and performance testing.
· Knowledge of Linux.
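To give a hedged sense of what a bare-bones stress/load test can look like in plain Python (dedicated tools such as Locust or JMeter go much further), here is a minimal sketch; the URL, request volume, and concurrency are hypothetical.

```python
# Minimal load-test sketch: fire N concurrent GET requests and report
# rough latency statistics. URL, volume, and concurrency are illustrative.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://api.example.com/ping"    # hypothetical endpoint
REQUESTS = 200
CONCURRENCY = 20

def timed_call(_):
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    return time.perf_counter() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_call, range(REQUESTS)))
    print(f"median latency: {statistics.median(latencies) * 1000:.1f} ms")
    print(f"p95 latency:    {latencies[int(len(latencies) * 0.95)] * 1000:.1f} ms")
```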
at Wissen Technology
JD - API + Mobile testing (iOS)
- Strong experience in manual testing of enterprise-class financial applications and web portals.
- Expert in UI testing on multiple platforms and browsers, and API testing using Postman.
- Experience in UI automation using tools like Selenium WebDriver and API automation using the Rest Assured framework with Java.
- Good experience in mobile app testing on the iOS platform.
- Good to have exposure to automation testing of mobile apps via Appium; exposure to and know-how of performance testing and security testing tools.
- Good experience in understanding the requirements, deriving the goals and taking ownership of tasks.
- Knowledge of AWS / Cloud would be a plus.
- Should be a self-driven engineer with the desire to use practical and professional concepts in QA, along with the application of QA standards and procedures to resolve routine issues.
- Should be able to write use cases based on product requirements, execute them and report issues in bug tracking system.
- Should be a self-starter who leads and contributes within the QA team. Helps the QA community learn automation and imparts the necessary technical knowledge. Improves process and quality via test and process automation: defining the right strategy and technology based on process and architecture assessment by engaging different roles and stakeholders.
- Deep practical experience with cutting-edge tools for Web, Mobile, Desktop, DB, and web-service testing (Selenium, Katalon, ReadyAPI/Postman, etc.)
- Experience with SQL: writing and understanding queries and procedures.
Responsibilities:
- Take ownership of QA requirements and provide testing guidance to the technology team; lead and coordinate application enhancement and tech ops testing activities with technology and business teams
- Should understand the requirement, design and develop the automation test cases
- Define and establish test strategy and process
- Understand complex nature of the application and come up with the test plan
- Should also work as Individual Contributor
- Participate in team communication and collaborate in planning activities, including stand-ups, iteration planning meetings (IPM), and retrospectives.
- Manage and communicate regular updates on project status (e.g., work completed, work in progress, next steps, risks, quality, KPIs, and costs) to stakeholders, peers, Product managers, the QA Manager, and others.
Qualifications:
- Very good hands-on and good knowledge of backend testing procedures, API testing and UI Testing.
- Exposure to test management and bug tracking tools (such as ALM, TestRail, Xray, JIRA, or others) and Agile methodologies.
- Knowledge of financial services and workflows, payment gateways, e-wallets, etc.
3-6 years of experience in functional testing with a good technical foundation.
Experience in the Capital Markets domain is a must.
Exposure to API testing tools like SoapUI and Postman.
Well versed in SQL.
Hands-on experience debugging issues using Unix commands.
Basic understanding of XML and JSON structures.
Knowledge of FitNesse is good to have.
Who are we and what do we do?
Dice is one of the most advanced SaaS fintech platforms; it helps businesses consolidate their spending from many fragmented applications, prepaid cards, offline reimbursement/invoice management, procurement, and payment systems onto a single spend platform. There will be no more cash, cards, refunds, or offline invoicing, since our entire goal is to give businesses more knowledge of and control over how they spend their money, resulting in significant cost savings and profitability. We are a hyper-growth startup collaborating with premium Indian enterprises, startups, and industry leaders across disciplines.
Perks in store for you when you join the team:
- You'll be surrounded by passionate team members.
- Your work will have a visible impact.
- You will be working on interesting technical challenges in a fast-paced environment.
Requirements and skills:
- Hands-on Software Development experience.
- 2-4 years of relevant experience in Java development.
- Hands-on experience in designing and developing applications using Java EE platforms.
- Object-Oriented Analysis and design using common design patterns.
- Deep understanding of Java and Java EE (multithreading, reactive programming, etc.)
- Excellent knowledge of relational databases, SQL, and ORM technologies (MySQL, Ebean)
- Excellent knowledge in RESTful API development, event-based processing.
- Experience with test-driven development.
- Knowledge of GIT, Linux, Docker, Redis
Responsibilities:
- Designing, implementing, and maintaining Java applications that are often high-volume and low-latency, required for mission-critical systems
- Delivering high availability and performance
- Contributing in all phases of the development lifecycle
- Writing well-designed, efficient, and testable code
- Conducting software analysis, programming, testing, and debugging
- Managing Java and Java EE application development
- Ensuring designs comply with specifications
- Preparing and producing releases of software components
- Translating requirements into technical specifications
- Support continuous improvement
- Investigating alternatives and technologies
Location:
Pune
Job Title: C++ Buffer Developer
Location: Pune, India
Experience: 2-3 years
Salary: 8 LPA
Notice Period: 0-15 days
Job Description:
We are seeking a skilled and passionate C++ Buffer Developer to join our team in Pune. As a C++ Buffer Developer, you will be responsible for designing, developing, and maintaining high-performance buffer systems for our software applications. You will collaborate with cross-functional teams to analyze requirements, implement solutions, and ensure the overall quality of the software.
Responsibilities:
- Design, develop, and maintain C++ buffer systems to meet the requirements of our software applications.
- Collaborate with cross-functional teams, including software engineers, designers, and product managers, to understand project requirements and deliver high-quality solutions.
- Write clean, efficient, and maintainable code following best practices and coding standards.
- Conduct thorough testing and debugging to ensure the stability and performance of the buffer systems.
- Optimize and enhance existing code to improve overall system efficiency and performance.
- Participate in code reviews to provide and receive constructive feedback for continuous improvement.
- Stay up-to-date with the latest industry trends and technologies related to C++ programming and buffer systems.
- Document the design, implementation, and maintenance of the buffer systems for future reference.
Requirements:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 2-3 years of hands-on experience in C++ programming, specifically in designing and developing buffer systems.
- Strong knowledge of data structures, algorithms, and object-oriented programming principles.
- Proficiency in using C++11 or higher versions.
- Experience with memory management techniques and performance optimization.
- Familiarity with Linux/Unix environments and development tools.
- Good understanding of software development lifecycle and agile methodologies.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work independently and handle multiple tasks simultaneously.
- Knowledge of network protocols and socket programming is a plus.
- Experience with version control systems (e.g., Git) is preferred.
If you are a talented C++ developer with a passion for buffer systems and want to contribute to the success of our software applications, we would love to hear from you. Apply now and join our dynamic team in Pune!
Note: The salary mentioned is as per the budget and may be subject to negotiation based on the candidate's skills and experience.
L2 Support
Location : Mumbai, Pune, Bangalore
Requirement details : (Mandatory Skills)
- Excellent communication skills
- Production Support, Incident Management
- SQL (must have experience writing complex queries)
- Unix (must have working experience on the Linux operating system)
- Perl/Shell scripting
- Candidates working in the Investment Banking domain will be preferred
Responsibilities:
• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing
• Implementing Spark-based ETL frameworks
• Implementing big data pipelines for data ingestion, storage, processing, and consumption
• Modifying Informatica-Teradata and Unix-based data pipelines
• Enhancing Talend-Hive/Spark and Unix-based data pipelines
• Developing and deploying Scala/Python-based Spark jobs for ETL processing (see the sketch after this list)
• Applying strong SQL and DWH concepts
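The sketch below gives a hedged, minimal picture of such an ingestion-to-consumption flow in PySpark: read raw semi-structured data, apply a simple transform, and write a partitioned, Hive-managed table. The paths, column names, and table name are illustrative assumptions, not details from this posting.

```python
# Minimal PySpark ETL sketch: JSON in, cleaned and partitioned Parquet table out.
# Paths, columns, and table names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("etl-sketch")
    .enableHiveSupport()          # lets saveAsTable create a Hive-managed table
    .getOrCreate()
)

raw = spark.read.json("hdfs:///data/raw/trades/")       # hypothetical source path

cleaned = (
    raw
    .withColumn("trade_date", F.to_date("trade_ts"))    # derive a partition column
    .filter(F.col("amount") > 0)                        # basic data-quality filter
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("trade_date")
    .format("parquet")
    .saveAsTable("analytics.trades")                    # hypothetical target table
)
```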
Preferred Background:
• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
• Understanding the business's EDW systems and creating high-level design documents and low-level implementation documents
• Understanding the business's Big Data Lake systems and creating high-level design documents and low-level implementation documents
• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Requirements
• Extensive and expert programming experience in at least one general programming language (e.g. Java, C, C++) & tech stack to write maintainable, scalable, unit-tested code.
• Experience with multi-threading and concurrency programming.
• Extensive experience in object oriented design skills, knowledge of design patterns, and a huge passion and ability to design intuitive modules and class-level interfaces.
• Excellent coding skills - should be able to convert design into code fluently.
• Knowledge of Test Driven Development.
• Good understanding of databases (e.g. MySQL) and NoSQL (e.g. HBase, Elasticsearch, Aerospike, etc).
• Strong desire to solve complex and interesting real world problems.
• Experience with full life cycle development in any programming language on a Linux platform.
• Go-getter attitude that reflects in energy and intent behind assigned tasks.
• Worked in a startup-like environment with high levels of ownership and commitment.
• BTech, MTech or Ph. D. in Computer Science or related technical discipline (or equivalent).
• Experience in building highly scalable business applications, which involve implementing large complex business flows and dealing with huge amounts of data.
• 3+ years of experience in the art of writing code and solving problems on a large scale.
• Open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.
About us:
Arista Networks was founded to pioneer and deliver software driven cloud networking solutions for large datacenter storage and computing environments. Arista's award-winning platforms, ranging in Ethernet speeds from 10 to 400 gigabits per second, redefine scalability, agility and resilience. Arista has shipped more than 20 million cloud networking ports worldwide with CloudVision and EOS, an advanced network operating system. Committed to open standards, Arista is a founding member of the 25/50GbE consortium. Arista Networks products are available worldwide directly and through partners.
About the job
Arista Networks is looking for world-class software engineers to join our Extensible Operating System (EOS) software development team. As a core member of the EOS team, you will be part of a fast-paced, high-caliber team building features to run the world's largest data center networks. Your software will be a key component of Arista's EOS, Arista's unique, Linux-based network operating system that runs on all of Arista's data center networking products.
The EOS team is responsible for all aspects of the development and delivery of software meant to run on the various Arista switches. You will work with your fellow engineers and members of the marketing team to gather and understand the functional and technical requirements for upcoming projects. You will help write functional specifications, design specifications, test plans, and the code to bring all of these to life. You will also work with customers to triage and fix problems in their networks. Internally, you will develop automated tests for your software, monitor the execution of those tests, and triage and fix problems found by your tests. At Arista, you will own your projects from definition to deployment, and you will be responsible for the quality of everything you deliver.
This role demands strong and broad software engineering fundamentals, and a good understanding of networking, including capabilities like L2, L3, and the fundamentals of commercial switching HW. Your role will not be limited to a single aspect of EOS at Arista, but will cover all aspects of EOS.
Responsibilities:
- Write functional specifications and design specifications for features related to forwarding traffic on the internet and cloud data centers.
- Independently implement solutions to small-sized problems in our EOS software, using the C, C++, and python programming languages.
- Write test plan specifications for small-sized features in EOS, and implement automated test programs to execute the cases described in the test plan.
- Debug problems found by our automated test programs and fix the problems.
- Work on a team implementing, testing, and debugging solutions to larger routing protocol problems.
- Work with Customer Support Engineers to analyze problems in customer networks and provide fixes for those problems when needed in the form of new software releases or software patches.
- Work with the System Test Engineers to analyze problems found in their tests and provide fixes for those problems.
- Mentor new and junior engineers to bring them up to speed in Arista’s software development environment.
- Review and contribute to the specifications and implementations written by other team members.
- Help to create a schedule for the implementation and debugging tasks, update that schedule weekly, and report it to the project lead.
Qualifications:
- BS in Computer Science/Electrical Engineering/Computer Engineering + 3-10 years of experience, or MS in Computer Science/Electrical Engineering/Computer Engineering + 5 years of experience, or a Ph.D. in Computer Science/Electrical Engineering/Computer Engineering, or equivalent work experience.
- Knowledge of C, C++, and/or python.
- Knowledge of UNIX or Linux.
- Understanding of L2/L3 networking including at least one of the following areas is desirable:
- IP routing protocols, such as RIP, OSPF, BGP, IS-IS, or PIM.
- Layer 2 features such as 802.1d bridging, the 802.1d Spanning Tree Protocol, the 802.1ax Link Aggregation Control Protocol, the 802.1AB Link Layer Discovery Protocol, or RFC 1812 IP routing.
- Ability to utilize, test, and debug a packet forwarding engine and a hardware component vendor's provided software libraries in your solutions.
- Infrastructure functions related to distributed systems such as messaging, signalling, databases, and command line interface techniques.
- Hands on experience in the design and development of ethernet bridging or routing related software or distributed systems software is desirable.
- Hands on experience with enterprise or service provider class Ethernet switch/router system software development, or significant PhD level research in the area of network routing and packet forwarding.
- Applied understanding of software engineering principles.
- Strong problem solving and software troubleshooting skills.
- Ability to design a solution to a small-sized problem and implement that solution without outside help. Able to work on a small team solving a medium-sized problem with limited oversight.
Resources:
- Arista's Approach to Software with Ken Duda (CTO): https://youtu.be/TU8yNh5JCyw
- Additional information and resources can be found at https://www.arista.com/en/
About this role:
- You will be working with the WiFi team at Arista, developing cutting edge and next generation WiFi solutions in a fast-paced environment. The WiFi team is responsible for the end to end development of the Cloud managed WiFi product portfolio of Arista. This specific position is for the WiFi AccessPoint team.
- As a core member of the AccessPoint team, you will be working closely with relevant teams to understand product requirements, design the solution, build the software and deliver it for final validation and customer deployment.
- You will also keep track of new and emerging technologies and their impact on Arista products, come up with new and innovative ideas to improve and differentiate the product and help Arista become a leading player in the Campus space.
- You will work closely with sales and support teams to push new solutions, understand customer needs and pain points and help resolve escalations.
- You will not be limited to a single aspect of the product; it will be broad, encompassing many different aspects including, but not limited to, developing new Access Points, designing and implementing new features, tracking new technologies, and working closely with the sales and customer teams.
Requirements:
• Strong engineering and Computer Science fundamentals
• Expected to have a strong background in software development and a good understanding of systems and networking, with knowledge of the WiFi area as an added bonus.
• Minimum 4+ years of relevant experience
• Well versed with programming in one of C/C++ languages
• Experience working in a Linux environment, developing applications or Linux drivers
• Proven experience in any of the below:
- Network device drivers, operating system internals, Kernels, compilers, SOC architecture
- Experience in developing Wi-Fi features (802.11), WLAN MAC Protocol, system integration, evaluate various performance parameters.
- User space development for connectivity related products (Wireless Lan access points/ controllers, networking equipment) in one or more of following areas:
• HostAPD, Portal, RADIUS, AAA, Identity and role management, Radsec
• Tunnels, Firewall, Iptables, Flow Classification, QoS, TLS, DTLS
Preferred Skills:
• Experience with Wi-Fi device drivers on Linux.
• Hands-on experience in working with one or more WIFI chipset platforms
• Good System Level understanding of the Wireless AP functionality
• Experience in developing Wi-Fi features, system integration, evaluate various performance parameters
Resources:
- Arista Cognitive WiFi : https://www.arista.com/en/products/cognitive-wifi https://youtu.be/cT1INdR-xHQ https://www.youtube.com/watch?v=olPkCOT3MdA
- Arista Cognitive WiFi Datasheet: https://www.arista.com/assets/data/pdf/Datasheets/CloudVision-Wifi-Datasheet.pdf
- Arista's Approach to Software with Ken Duda (CTO): https://youtu.be/TU8yNh5JCyw
- Additional information and resources can be found at https://www.arista.com/en/
Requirements
- 3+ years of experience in Java development.
- Strong Java Basics
- Linux
- SpringBoot or Spring MVC
- Hands-on experience in Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Message queues (RabbitMQ or Kafka)
- Microservices
- Java 8
- Any Caching Mechanism
- Good at problem-solving
Good to Have Skills:
- 3 years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding AI/ML algorithms is a plus.
Requirements:
- Energetic self-starter, with a desire to work in a startup environment.
- Proficient in advanced Java programming skills.
- Expert in end-to-end application development (cloud/on-premise), including the middle layer and DB layer.
- Nice to have: understanding of MQ and DB.
- Good hands-on experience with Complex Event Processing systems.
- Solved scale and performance issues while dealing with huge sets of data, using pre-compute or data aggregation frameworks to achieve good response times.
- Real world experience working with large datasets and NoSQL database technologies
- Experience of debugging applications running on Unix like systems (e.g. Ubuntu, CentOS)
- Experience developing RESTful APIs for complex data sets
- Knowledge of container-based development & deployment (e.g. Docker, rkt)
- Expertise in software security domain, a plus
Requirements
- Extensive and expert programming experience in at least one general programming language (e. g. Java, C, C++) & tech stack to write maintainable, scalable, unit-tested code.
- Experience with multi-threading and concurrency programming.
- Extensive experience in object-oriented design skills, knowledge of design patterns, and a huge passion and ability to design intuitive modules and class-level interfaces.
- Excellent coding skills - should be able to convert the design into code fluently.
- Knowledge of Test Driven Development.
- Good understanding of databases (e.g. MySQL) and NoSQL (e.g. HBase, Elasticsearch, Aerospike, etc).
- Strong desire to solve complex and interesting real-world problems.
- Experience with full life cycle development in any programming language on a Linux platform.
- Go-getter attitude that reflects in energy and intent behind assigned tasks.
- Worked in a startup-like environment with high levels of ownership and commitment.
- BTech, MTech or Ph. D. in Computer Science or related technical discipline (or equivalent).
- Experience in building highly scalable business applications, which involve implementing large complex business flows and dealing with huge amounts of data.
- 3+ years of experience in the art of writing code and solving problems on a large scale.
- An open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.
One of the world's leading multinational investment banks
- Provide hands on technical support and post-mortem root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
- Actively address and work on user and system tickets in the Service Now ticketing application. Create and implement change tickets for enhancements, new monitoring, and assisting development groups.
- Create, test, and implement Non-Functional Requirements (NFR) for current and new applications.
- Build up technical subject matter expertise on the applications being supported including business flows, application architecture, and hardware configuration. Maintain documentation, knowledge articles, and runbooks.
- Conduct real-time monitoring, using an array of monitoring tools, to ensure application OLAs/SLAs are achieved and application availability (uptime) is maximized.
- Assist in the process of approving change tickets for application code releases, as well as tasks assigned to the support team to perform and validate the associated implementation plan.
- Approach support with a proactive attitude, desire to seek root cause, in-depth analysis and triage, and strive to reduce inefficiencies and manual efforts.
at Altimetrik
Location: Chennai, Bangalore, Pune, Jaipur
Experience: 5 to 8 years
- Implement best practices for the engineering team across code hygiene, overall architecture design, testing, and deployment activities
- Drive technical decisions for building data pipelines, data lakes, and analyst access.
- Act as a leader within the engineering team, providing support and mentorship for teammates across functions
- Bachelor’s Degree in Computer Science or equivalent job experience
- Experienced developer in large data environments
- Experience using Git productively in a team environment
- Experience with Docker
- Experience with Amazon Web Services
- Ability to sit with business or technical SMEs to listen, learn and propose technical solutions to business problems
· Experience using and adapting to new technologies
· Take and understand business requirements and goals
· Work collaboratively with project managers and stakeholders to make sure that all aspects of the project are delivered as planned
· Strong SQL skills with MySQL or PostgreSQL
- Experience with non-relational databases and their role in web architectures desired
Knowledge and Experience:
- Good experience with Elixir and functional programming is a plus
- Several years of python experience
- Excellent analytical and problem-solving skills
- Excellent organizational skills
Proven verbal and written cross-department and customer communication skills
Please find the details below
Job Description
Candidate Must Have:
● Strong knowledge of Linux and Windows OS
● Experience in JIRA/Confluence/JSD Administration is a must.
● Good knowledge of any of the Databases is required.
● Experience of working on JIRA/JSD/Confluence Datacenters
● Experience in doing minor and major upgrades with & without downtime
● Experience in Jira server to cloud Migration
● Jira CSV/data migration experience is a must
● JIRA instance Projects merging (Server/Cloud to DC)
● The candidate should have hands-on knowledge of Atlassian plugins like Tempo, eazyBI, Zephyr, BigPicture, Configuration Manager for Jira, and Project Configurator for Jira
● Experience in Jira Service Management with Asset Management, CMDB, and ITSM implementation
Candidate good to have:
● An Atlassian Certification in JIRA Project Administration will be an edge.
● Installation and configuration of Atlassian applications with Linux and Windows OS
● Good knowledge of any of the Databases is required.
● Experience of working on JIRA/JSD/Confluence Datacenters
● Experience in doing minor and major upgrades with & without downtime
● Experience in JIRA instance Projects merging (Server/Cloud to DC)
● Experience in Confluence Spaces merging (Server/Cloud to DC)
● Perform Project migration, backup/restore, and archive activity
● Strong skills in troubleshooting JIRA and other Atlassian products.
● Work with JIRA to enable custom workflows, fields, dashboards, and reports
● Create and manage complex JIRA components including project workflows, screen schemes, permission schemes, and notification schemes in JIRA.
● The candidate should have hands-on knowledge of Atlassian plugins like Tempo, eazyBI, Zephyr, BigPicture, Configuration Manager for Jira, and Project Configurator for Jira
● Experience in requirement gathering, conducting Workshops and activity estimations
● Bitbucket and Bamboo knowledge will be added advantage
● Experience in Jira Service Management with Asset Management, CMDB, and ITSM implementation
Responsibilities
● Configuration, maintenance, and administration of Atlassian products (Jira, Confluence, Crowd, Bamboo, Bitbucket, etc.)
● Evaluate and manage the usage of Atlassian add-ons to meet the team and business needs of the customers
● Jira workflows, custom fields, pages, spaces, and other configurations based on the needs.
● Atlassian JIRA/JSM/Confluence DC installation setup and maintenance on AWS/Azure
● Hardware infrastructure setup, Maintenance, and troubleshooting.
● Jira/JSM/Confluence data migration from server to cloud and from one instance to another
Want to work with an established & growing IT company? Join team Benison to have the right challenges that will help you accelerate your career growth to the next level, faster!
Benison Technologies was started in 2011 with a mission to revolutionize the silicon industry in India. With a host of amazing big clients like Google, Cisco, McAfee, and Intel, you get to experience the best of both worlds. If you consider yourself an engineer capable of joining our ever-growing team, then this is the right opportunity for you:
Why Benison Tech?
We have a partial acquisition from one of the biggest names in the world (well, we can’t name them thanks to confidentiality); it’s one of the FAANG companies, and you can “Google” it if you like.
Oh, and one more thing: this did not happen by accident. Our team put in a ton of effort to turn this gigantic dream into a reality.
Benison Tech has a consistent history of demonstrating growth through innovation time and again.
We don’t stop there; we re-invest our profits back into initiatives for the growth of our people, our culture, and the company. Now, enough about us, let’s talk about the job roles & responsibilities:
What you will be working on:
- Key contributor for developing product strategies and features.
- Software development for an industry-leading SaaS platform
- You will be closely involved in the planning, design, and integration of client requirements.
- You will be working with one of the leaders in data resiliency and data protection.
Here are the technical skills required:
- Independently own features and create feature test plans/strategies based on development and feature completion milestones.
- Identify quality assurance process bottlenecks and suggest actions for improvement.
- Design automation framework for automating feature tests.
- Participate in test case, test plan, and code reviews.
- Resolve functional queries coming from other business units such as support, escalation, product management, etc.
- Participate in bug triaging and tracking quality assurance metrics.
- Hands-on experience with Python-Selenium or Cypress, will be preferred.
- Familiarity with Test Management systems like XRay and bug tracker like JIRA tools.
What we expect from you:
- 3-10 Years of relevant experience in QA Automation.
- Expert at test automation, creating test plans, test strategies for testing multiple product modules
- Should be able to quickly analyze failures and trace back to issues in the product or the automation suite.
- As a Software Development Engineer in Test you should be an expert at test automation for APIs as well as UI, creating test plans and test strategies for testing product features.
- You will guide and mentor junior team members by reviewing their automation code and test cases to ensure good coverage and quality of a feature
- Resolve functional queries coming from other business units such as support, escalation, product management, etc.
- Be a quick learner and be open to working on new technologies if needed.
- Excellent team player with strong verbal & written communication skills.
- Be able to step up when the situation demands such as meeting deadlines and critical production issues.
- Propose changes or enhancements to the framework for enabling new feature tests.
A few skills that will add brownie points to your role:
- Working knowledge of Docker and Kubernetes will be an advantage
- Awareness of general manual and automation concepts and all types of testing methods
- Knowledge of the Backup or Storage domain will be an advantage.
If the above fits your skill-sets and tickles your interest then read below about the additional benefits that our company offers to talented folks like you:
Work Culture and Benefits
- Competitive salary and benefits package (H1-B sponsorship, which means a chance to work onsite outside of India)
- A culture focused on talent development, where you get promoted within the quarterly cycle of your anniversary.
- Opportunity to work with cutting-edge & challenging technologies including legacy tech.
- Open cafeteria to grab some munchies while you work, we make sure the space feels like your second home, you can also wear pyjamas if you like.
- Employee engagement initiatives such as project parties, flexible work hours, and long service awards, team bonding activities within the company, extra learning and personal development trainings, because why stop your learning at one thing!
- Insurance coverage: Group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and your parents. (With some of the best insurance partners in India)
- Enjoy collaborative innovation (each member gets to innovate & think out of the box), along with highly experienced team managers who maintain diversity and work-life well-being.
- And of course, you get to work on projects from some of the most recognised brands within the networking and security space of the world, unlocking global opportunities to learn, grow & contribute in a way that is truly impactful yet purposeful at the same time.
Still not satisfied, and want more proof?
Head to our website https://benisontech.com to learn more.
at Altimetrik
DevOps Architect
Experience: 10-12+ years of relevant DevOps experience
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering, or equivalent is required.
• Certifications in specific areas are desired
Technical Skillset (skill - proficiency level):
- Build tools (Ant or Maven) - Expert
- CI/CD tool (Jenkins or Github CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)
Roles and Responsibilities
• DevOps architect should automate the process with proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operation and developers teams to solve their problems.
• Supervising, Examining and Handling technical operations.
• Providing a DevOps Process and Operations.
• Capacity to handle teams with leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as a Code"(IaaC), including experience with open-source tools such as terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
Global multinational bank
- 8-13 years of relevant experience
- Strong analytical and problem solving skills
- Knowledge of software design, testing and development practices
- Must have hands on experience in Q/KDB+.
- Good knowledge of Python, Java, C++, Shell Script(any one)
- Good knowledge on SQL/PLSQL and databases (any RDBMS)
- Have experience on using Git/Bit bucket
- Good communication skills, both verbal and written.
- Ability to work under pressure, with minimal supervision, and in a team environment
- Detail-oriented and with strong organizational skills
- 4-15 year experience in Application Support
- Must have good knowledge in Java/J2EE, Microservices, PL/SQL, Unix
- Good to have knowledge in Agile, JIRA, Splunk, Service Now
- Good understanding and hands-on experience in Incident, Problem and Change Management
- Provide technical leadership to the team & contribute in the skill development within team
- Interact with internal teams and client stakeholders to troubleshoot tickets/incidents and manage other support activities
- Good communication skills are necessary; must be a team player and inquisitive.
- Strong customer service and support focus with a desire to deliver a high-quality service
- Ability to multi-task, work under pressure and to tight deadlines
- Flexible in working outside of office business hours at short notice (as required)
- Should be able to examine the system and identify the areas for Service Improvements & Value adds.
Leading Global Provider For Secure Data Erasure Solutions
Job Details:
As a Software engineer you will be able to challenge the idea of “impossible”, producing results that are elegant, simple and don’t require a team of experts to decode. You are driven by innovation, fresh ideas and new ways to produce high quality solutions.
Job Description:
Position Summary:
We are looking for a Cloud developer responsible for the development and maintenance of cloud applications deployed in an AWS environment. Your primary focus will be the development of such applications and their integration with other services. A commitment to an open mind, problem solving, the ability to learn, and creating quality products is essential.
Responsibilities:
- Ensure the performance, quality, and responsiveness of services
- Collaborate with a team to define, design, and ship new features
- Innovative thinking of finding solutions to needs
- Identify and correct bottlenecks and fix bugs
- Help maintain code quality, automation, and documentation
- Use Agile Scrum Methodology for software development
- Develop unit tests for all new code
- Provide code reviews for all new code and participate in code reviews of other people's code
- Diagnose and resolve complex application issues
- Participate in interactions with all levels of personnel across different teams
- Design and build services on top of AWS
Skills:
- Strong knowledge of Python
- Strong knowledge of Web Services (Rest or SOAP APIs)
- Strong knowledge of React JS or any other JavaScript
- Solid understanding of object-oriented programming
- Knowledge of Java and Spring Boot is good to have
- Knowledge of AWS is good to have
- Knowledge of TypeScript is good to have
- Knowledge of Linux is good to have
- Knowledge of HTML and CSS is good to have
- Knowledge of AWS CloudFormation is good to have
- Knowledge of Elasticsearch is good to have
- Familiarity with continuous integration
- Any recognized Java, AWS, Linux, or Python certifications will be an added value
- Min 2 years of work experience in relevant technologies
- Excellent interpersonal and written communication skills
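As a hedged, minimal illustration of the kind of Python REST service such a role builds and maintains, here is a sketch using Flask with an in-memory store; the routes, fields, and the choice of Flask itself are illustrative assumptions rather than details of this team's stack.

```python
# Minimal REST service sketch in Flask: register a resource and fetch it back.
# Routes, fields, and the in-memory store are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)
DEVICES = {}   # stands in for a real database

@app.route("/devices", methods=["POST"])
def register_device():
    payload = request.get_json(force=True)
    device_id = payload["id"]
    DEVICES[device_id] = {"status": payload.get("status", "unknown")}
    return jsonify({"id": device_id}), 201

@app.route("/devices/<device_id>", methods=["GET"])
def get_device(device_id):
    if device_id not in DEVICES:
        return jsonify({"error": "not found"}), 404
    return jsonify(DEVICES[device_id])

if __name__ == "__main__":
    app.run(port=8080)
```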
Experience required - 3 years (minimum) in the MERN stack
Salary - 15-20 LPA
About the Company
The company is one of the fastest-growing B2B SaaS marketplaces for procuring industrial materials. The startup is generating annual revenue of Rs. 100 crore. You will get to work in a fast-paced environment with a brilliant, agile tech team led by top engineers.
Location: Pune, 3 months Remote then Work from Office or Hybrid Working
Qualifications & Criteria
1. 3+ Years of development experience in the full-stack development (preferred MERN technology stack). Should be proficient in working with technologies like JavaScript, CSS, and JS with frameworks like NodeJs and ReactJS.
2. Should have good knowledge of databases as well as Linux.
3. As part of the brains of the startup, you are smart, creative, and love solving business challenges, and thereby find new ways to propel the growth of the company.
4. You are passionate about growth and want to become a future technical leader within the company. Your work and attitude should reflect your commitment to the same.
Responsibilities:
1. Should be able to work independently once proper guidance is provided.
2. Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
3. Ensure timely completion of development activity.
4. Prepare training documents and provide training to internal teams on tools.
5. Consult, design, and develop well-structured, scalable in-house tools.
6. Should be able to write optimized business logic for business functions
• C++, Unix environments (Linux/AIX/HP-UX), Oracle/MySQL
• Excellent command on OOPS
• Minimum of 3 years (for Mid and Junior) of hands-on work experience in C++, Unix
• Oracle/MySQL
• Hands-on experience of using data structures, STL, Boost libraries, Design patterns
• Exposure to XML or Edifact is desired
• Exposure to XSLT mappings is a plus
• Excellent troubleshooting skills
• Exposure to CppUnit (or similar tools)
Experience range:
• 4 to 8 years of experience
Joining Location:
• Pune, Gandhinagar & Hyderabad (Preferably Pune & Gandhinagar)
Minimum of 4+ years of experience in Java development
- Experience delivering services (REST, SOAP) and web applications in a microservices architecture
- Experience developing and deploying Java solutions to cloud
- Experience in Spring Boot and components of Spring framework
- Experience in a JavaScript framework such as Angular or React
- Experience in TDD using Junit or similar frameworks
· Experience in Design Patterns and service oriented architectural principles, Data structures and Algorithms.
· Individual should be an active participant in the product design and code reviews for self and team and can competently review any aspect of their product or major sub-system.
· Experience in SQL and Unix.
· Good communication skills
Intuitive is the fastest-growing top-tier Cloud Solutions and Services company, supporting global enterprise customers across the Americas, Europe, and the Middle East.
Intuitive is looking for highly talented hands-on Cloud Infrastructure Architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive’s global world class technology teams, working with some of the best and brightest engineers while also developing your skills and furthering your career working with some of the largest customers.
Job Description :
- Extensive experience with K8s (EKS/GKE) and K8s ecosystem tooling, e.g., Prometheus, ArgoCD, Grafana, Istio, etc.
- Extensive AWS/GCP Core Infrastructure skills
- Infrastructure/ IAC Automation, Integration - Terraform
- Kubernetes resources engineering and management
- Experience with DevOps tools, CICD pipelines and release management
- Good at creating documentation (runbooks, design documents, implementation plans)
Linux Experience :
- Namespace
- Virtualization
- Containers
Networking Experience
- Virtual networking
- Overlay networks
- Vxlans, GRE
Kubernetes Experience :
Should have experience in bringing up the Kubernetes cluster manually without using kubeadm tool.
Observability
Experience in observability is a plus
Cloud automation :
Familiarity with cloud platforms, especially AWS, and DevOps tools like Jenkins, Terraform, etc.
We are hiring for Backend Developer for Pune Kharadi.
Experience: Min 2+ Years with Python/Django.
Must have Skills:
- Excellent knowledge of Python/Django code structure.
- Good understanding of Design patterns and OOPS concepts.
- Good understanding of ORM.
- Good understanding of PostgreSQL.
- Knowledge of code optimization techniques is preferred.
- Implementing integrated technology-based solutions and identifying integration opportunities for a similar package of services.
- Excellent knowledge of Linux, Nginx.
- Excellent knowledge of Celery and RabbitMQ (see the short sketch after this list)
- Excellent Knowledge of Git.
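As a hedged, minimal sketch of how Django work is typically offloaded to Celery with RabbitMQ as the broker, consider the snippet below; the broker URL, task name, and retry policy are illustrative assumptions only.

```python
# Minimal Celery sketch: a retryable background task with RabbitMQ as broker.
# Broker URL and task body are illustrative.
from celery import Celery

app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

@app.task(bind=True, max_retries=3)
def send_invoice_email(self, invoice_id):
    try:
        # ... render and send the email for invoice_id (omitted) ...
        return f"invoice {invoice_id} sent"
    except Exception as exc:
        # retry with a delay instead of failing inside the web request cycle
        raise self.retry(exc=exc, countdown=30)

# Enqueue from Django view code with: send_invoice_email.delay(42)
```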
Additional skills:
- Knowledge in Docker and Kubernetes is a plus.
- Good to have some knowledge of Angular and some frontend technologies like HTML, CSS, and JavaScript.
- Should take responsibility and ownership of delivery.
Education & Qualifications:
- Must have a Graduate/Master’s degree in any vertical or global equivalent from a reputed university.
- Certification in Python/data science would be added advantage.
You will be building software using C/C++ and related technologies, mainly on Windows platforms.
● Manage priorities, deadlines, and deliverables with your technical expertise
● Research solutions and decide the best and practical solution for complex problems
● Lead designs of major product components, and features
● Design, develop, test, maintain and enhance the product
● Analyze issues reported by customers
● Mentor and train team members on design techniques and technologies
Desired Key Skills -
● 4-7 years of relevant experience in C/C++ development on any platform (Linux, Windows, macOS)
● Experience and skills in designing components and modules
● Experience mentoring the team on technical skills
● Experience guiding the team on technical needs
● Working proficiency and communication skills in verbal and written English
● Experience in XML, STL, Win32 SDK, Dynamic Library / Shared Library, Process, Multithreading, Windows Messages, ATL, COM, HTTP, File I/O, Memory Mapping, API Hooking, and Memory Management on Windows or Linux platforms
● Experience in Windows System Programming
● Experience in debugging and troubleshooting using tools like Sysinternals tools, debuggers/WinDbg, and API monitoring/tracing
● Experience in MS Office & Outlook Object Model
● Experience in Cryptography, Data Security, Information Security and Security Technologies
● Experience in Cross-Platform development
● Experience in building desktop software
The key aspects of this role:
• Candidate with exceptional programming skills, problem-solving abilities and a strong work ethic.
• The candidate has to work on custom programming and web application development for Drupal.
• Testing, maintenance and troubleshooting of existing company sites and resolving issues, if any.
• Contributing ideas and efforts towards internal projects and working as part of a team to find solutions to various problems.
• Communicate technical ideas to business users and other teams (design, QA).
• Collaborate with team members and to work independently when needed.
• Eager to embrace current and emerging web technologies.
To be the right fit, you'll need:
• 4+ years of total experience working on PHP and Drupal 7/8 development (at least 2+ years in Drupal 8)
• Strong knowledge of MySQL, jQuery, HTML5, CSS
• Experience with Drupal architecture, best practices and coding standards
• Knowledge of Views, Services, etc.
• Experience in custom module and theme creation
• Familiarity with version control systems such as Git or SVN
• Basic knowledge of environment setup on a Linux distribution.
Profile – MongoDB Administrator
Experience -2-4 Yrs
Location – Baner, Pune
Job Description-
We are looking for an experienced MongoDB DBA who will maintain MongoDB clusters and databases while optimizing the performance, security, and availability of those clusters.
Skills:
- Experience in setting up and managing MongoDB using OpsManager
- Experience with DevOps automation tools
- Experience with Microsoft Azure desirable
- Experience in working with a Linux environment
- Experience in designing indexing and data archival strategies (see the sketch below)
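As a small illustration of the indexing and archival strategies mentioned above, the sketch below uses pymongo to create a compound index and a TTL index; the connection string, database, collection, and field names are made-up examples.

```python
from pymongo import MongoClient, ASCENDING

# Illustrative only: connection string and names are placeholders.
client = MongoClient("mongodb://localhost:27017")
events = client["app_db"]["events"]

# Compound index to support queries filtering on user_id and sorting by created_at.
events.create_index([("user_id", ASCENDING), ("created_at", ASCENDING)])

# TTL index: documents expire 90 days after created_at - one simple archival policy.
events.create_index("created_at", expireAfterSeconds=90 * 24 * 3600)
```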
Responsibilities:
- Maintain and configure MongoDB instances
- Write procedures for backup and disaster recovery
- Assist developers in detecting performance problems
- Ensure that the databases achieve maximum performance and availability
- Upgrade databases through patches
- Implement an optimal backup and recovery solution
Opportunity for Unix Developer!!
We at Datametica are looking for talented Unix engineers who will be trained and given the opportunity to work on Google Cloud Platform, DWH, and Big Data.
Experience - 2 to 7 years
Job location - Pune
Mandatory Skills:
Strong experience in Unix with Shell Scripting development.
What opportunities do we offer?
- Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- You would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- You will play an active role in setting up the Modern data platform based on Cloud and Big Data
- You would be a part of teams with rich experience in various aspects of distributed systems and computing.
The key aspects of this role:
- Candidate with exceptional programming skills, problem-solving abilities and strong work ethic.
- The candidate has to work on custom programming and web application development for Drupal.
- Testing, maintenance and troubleshooting of existing company sites and resolving issues if any.
- Contributing ideas and efforts towards internal projects and working as part of a team to find solutions to various problems.
- Communicate technical ideas to business users and other teams (design, QA).
- Collaborate with team members and work independently when needed.
- Eager to embrace current and emerging web technologies.
To be the right fit, you'll need:
- 4+ years of total experience working on PHP and Drupal 7/8 development (at least 2+ years in Drupal 8)
- Strong knowledge of MySQL, jQuery, HTML5, CSS
- Experience with Drupal architecture, best practices and coding standards
- Knowledge of Views, Services, etc.
- Experience in custom module and theme creation
- Familiarity with version control systems such as Git or SVN
- Basic knowledge of environment setup on a Linux distribution
• Provides remote planning (design), implementation and/or administrative support on Dell server and storage products involving software.
• Performs initial installation, implementation, customization, integration and outline orientation for the customer.
• Works closely with other Dell teams, the account team and the customer.
Essential Skill Requirements:
• Understanding of the compute environment ecosystem
• Dell PowerEdge servers & modular servers – planning, implementation and/or administration
• Dell PowerVault – MD / ME4 series storage planning, implementation and/or administration
• Dell Storage – NX, SC series storage planning and implementation
• Experience with basic network switch technologies (Ethernet, Fibre Channel), IP networking and L2 switches
• Operating system installation and configuration: Windows Server (inclusive of Hyper-V clustering), VMware ESXi and virtualization, Red Hat Linux
• File, P2V and/or V2V migration experience will be an added advantage
Desirable Requirements:
• Customer service skills
• Stakeholder management
• Excellent problem solving, communication and organizational skills
• Flexibility, dependability and excellent time management skills
• Good presentation skills
• Analytical, articulate, results-oriented and able to provide excellent follow-ups
• Strong technical aptitude
• Ability to multi-task and influence others to achieve results
• Professional certification from Cisco/VMware/Microsoft/Red Hat/Cloud will be an added advantage
About QuestionPro:
QuestionPro is one of the leading market research platforms. We have a wide range of products in Market Research, Customer Experience, Employee Experience, and Vehicle Experience. All our products are multi-tenant SaaS platforms built on the latest technologies.
Our infrastructure is spread across 6 data centers across the globe. The platform collects over 10 million surveys every month. Our Customer Experience platform was named a top provider in the Gartner Voice of the Customer rankings. Since launching in 2016, we have grown by over 200% YoY, and we are on track to hit $31M in 2021. We are bootstrapped and proud to have gotten where we are without any funding or investments.
Our operations are spread across the globe with offices in the US, Mexico, Germany, UK, UAE, and Canada.
https://www.questionpro.com/blog/cx-top-provider-gartner-voc-rankings/
We are a bootstrapped company and proud to have not taken any funding or investment.
QuestionPro has a particularly exciting journey ahead, requiring a passionate individual to join our growing team. If you are a true technology craftsman and want to build cutting-edge software solutions, reach out to us!
We operate 100% remote. You will be working from any place you desire for this position.
Responsibilities
- You will be responsible for key deliverables that would help improve the quality and reliability of our infrastructure spread across 8 data centres, both cloud and hybrid.
- Streamline Life Cycle Management activities for the Infrastructure.
- You will be interacting with DevOps & Support teams to quickly investigate and mitigate the problems impacting customers at various levels in the Infrastructure including MySQL Database and Engineered Systems technology stack.
Skills & Requirements
Must-Have:
- 8+ years of experience in Linux System Administration / Development / QA roles with a thorough understanding of Software Development Life Cycle
- Excellent knowledge of Linux System administration activities
- A very good understanding of Linux kernel internals, Server Virtualization, Networking & Security layer
- Any experience in IO subsystem, Operating Systems, Storage technologies would be a definite plus
- Hands-on experience with automation of system administration activities (a small sketch follows this section)
- Hands-on experience in building OS images and testing them against a standard test framework
- Proven ability to triage and resolve issues during patch testing & certification
- Experience in OEM, Linux OS Patching, Yum, KSplice, etc.
- Excellent Scripting skills in Bash, Perl, Python, or similar scripting languages
Good To Have:
- Good experience in private cloud management.
- Proxmox virtualization
- Ability to build and grow the team.
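To make the "automation of system administration activities" point above concrete, here is a tiny Python sketch that flags filesystems over a usage threshold; the mount points and the 85% threshold are arbitrary example values, not part of the role description.

```python
#!/usr/bin/env python3
"""Minimal sysadmin-automation sketch: warn when a mount point crosses a usage threshold."""
import shutil

# Hypothetical mount points and threshold; adjust per environment.
MOUNTS = ["/", "/var", "/home"]
THRESHOLD = 0.85

def check_disk(mounts=MOUNTS, threshold=THRESHOLD):
    alerts = []
    for mount in mounts:
        usage = shutil.disk_usage(mount)      # total/used/free in bytes
        used_ratio = usage.used / usage.total
        if used_ratio > threshold:
            alerts.append(f"{mount}: {used_ratio:.0%} used")
    return alerts

if __name__ == "__main__":
    for alert in check_disk():
        print("WARNING:", alert)
```

A real deployment would typically run something like this from cron or a monitoring agent and push alerts to a central system rather than printing them.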
We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description :
Experience: 6+ Years
Work Location: Pune / Hyderabad
Technical Skills :
- Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server Developer
- Knowledge of database performance tuning techniques
- Rich experience in database development
- Experience in designing and implementing business applications using the Oracle Relational Database Management System
- Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL
Required Candidate Profile :
- Excellent communication, interpersonal, analytical skills and strong ability to drive teams
- Analyzes data requirements and data dictionary for moderate to complex projects
- Leads data model related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
- Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
- Stakeholder management and client engagement skills
- Strong communication skills (written and verbal)
About Us!
A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Responsibilities for Data Engineer
- Create and maintain an optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a minimal Airflow sketch follows this list)
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
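As an example of the workflow-management tooling listed above, here is a minimal Airflow DAG sketch in Python; the DAG id, schedule, and task bodies are illustrative placeholders and not a prescribed pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical two-step pipeline: extract from a source, then load to a warehouse.
def extract():
    print("pulling raw records from the source system")

def load():
    print("loading transformed records into the warehouse")

with DAG(
    dag_id="example_etl",            # placeholder name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # load runs only after extract succeeds
```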
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems and technical requirements, in order to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, databases
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches (see the sketch after this list)
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
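To illustrate the REST web-services point above, here is a minimal Python sketch using Flask; the endpoint path and payload are invented for the example, and Flask itself is an assumption (any comparable framework would do).

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health", methods=["GET"])
def health():
    # Trivial health-check endpoint returning a JSON payload.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Development server only; a production service would sit behind a WSGI server.
    app.run(port=5000)
```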
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Looking for folks with a hacker's mindset
We’re seeking a talented developer to help build our platform from the ground up. As a critical member of a dynamic and highly-motivated team, you will have to perform a full stack developers role. And will be responsible for designing & coding new functionality from scratch and maintaining or enhancing the current functionality.
Must haves
- Clear communication skills and a strong customer first attitude
- PHP expert with live project implementations in Laravel framework with MySQL database
- Write clean and well-designed & scalable code
- Maniacal about creating flawless & defect-free delivery
- Troubleshoot, test and maintain the core product code and databases. Refactor or optimize existing code & database where needed
- Maintain a positive and healthy work environment within the team and have fun while doing it
- Contribute to all phases of the development lifecycle, from requirements through to deployment
- Ability to learn and implement automated test tools
We’re seeking a talented developer to help build our platform from the ground up. As a critical member of a dynamic and highly-motivated team, you will take on a critical developer role and will be responsible for designing & coding new functionality from scratch.
Looking for folks with a hacker mindset
- Write “clean” and well-designed & scalable code
- Maniacal about creating flawless & defect-free delivery
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Maintain a positive and healthy work environment within the team and have fun while doing it
- Contribute to all phases of the development lifecycle, from requirements through to deployment
• Candidate with exceptional programming skills, problem-solving abilities and a strong work ethic.
• The candidate has to work on custom programming and web application development for Drupal.
• Testing, maintenance and troubleshooting of existing company sites and resolving issues, if any.
• Contributing ideas and efforts towards internal projects and working as part of a team to find solutions to various problems.
• Communicate technical ideas to business users and other teams (design, QA).
• Collaborate with team members and to work independently when needed.
• Eager to embrace current and emerging web technologies.
To be the right fit, you'll need:
• 4+ years of total experience working on PHP and Drupal 7/8 development (at least 2+ years in Drupal 8)
• Strong knowledge of MySQL, jQuery, HTML5, CSS
• Experience with Drupal architecture, best practices and coding standards
• Knowledge of Views, Services, etc.
• Experience in custom module and theme creation
• Familiarity with version control systems such as Git or SVN
• Basic knowledge of environment setup on a Linux distribution
ABOUT US (https://www.ashnik.com/)
Established in 2009, Ashnik is a leading open-source solutions and consulting company in South East Asia and India, headquartered in Singapore. We enable digital transformation for large enterprises through our design, architecting, and solution skills. Over 100 large enterprises in the region have acknowledged our expertise in delivering solutions using key open-source technologies. Our offerings form a critical part of Digital transformation, Big Data platforms, Cloud and Web acceleration, and IT modernization. We represent EDB, Pentaho, Docker, Couchbase, MongoDB, Elastic, NGINX, Sysdig, Redis Labs, Confluent, and HashiCorp as their key partners in the region. Our team members bring decades of experience in delivering confidence to enterprises in adopting open source software and are known for their thought leadership.
As a team culture, Ashnik is a family for its team members. Each member brings in a different perspective, new ideas and a diverse background. Yet together we all strive for one goal – to deliver the best solutions to our customers using open source software. We passionately believe in the power of collaboration. Through an open platform of idea exchange, we create a vibrant environment for growth and excellence.
THE POSITION
Ashnik is looking for talented and passionate people to be part of the team for an upcoming project at client location.
QUALIFICATION AND EXPERIENCE
- Preferably 4 or more years of working experience on production PostgreSQL DBs
- Experience of working in a production support environment
- Engineering or Equivalent degree
- Passion for open-source technologies is desired
ADDITIONAL SKILLS
- Install & Configure PostgreSQL, Enterprise DB
- Technical capabilities in PostgreSQL 9.x, 10.x, 11.x
- Server tuning
- Troubleshooting of Database issues
- Linux Shell Scripting
- Install, Configure and maintain Fail Over mechanism
- Backup and restoration, point-in-time database recovery
- A demonstrable ability to articulate and sell the benefits of modern platforms, software and technologies.
- A real passion for being curious and a continuous learner. You are someone that invests in yourself as much as you invest in your professional relationships.
RESPONSIBILITIES
- Monitoring database performance (see the sketch after this list)
- Optimizing Queries and handle escalations
- Analyse and assess the impact and risk of low- to medium-risk changes on high-profile production databases
- Implement security features
- DR implementation and switch over
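As a small illustration of the database-performance-monitoring responsibility above, the sketch below uses psycopg2 to list the longest-running active queries from pg_stat_activity; the connection parameters are placeholders for the example.

```python
import psycopg2

# Illustrative monitoring query; connection parameters are placeholders.
conn = psycopg2.connect("dbname=postgres user=postgres host=localhost")
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, state, now() - query_start AS runtime, query
        FROM pg_stat_activity
        WHERE state <> 'idle'
        ORDER BY runtime DESC
        LIMIT 10;
    """)
    # Print the ten longest-running non-idle sessions.
    for row in cur.fetchall():
        print(row)
conn.close()
```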
LOCATION: Pune
Experience: 2 yrs plus
ABOUT US.
Established in 2009, Ashnik is a leading open-source solutions and consulting company in South East Asia and India, headquartered in Singapore. We enable digital transformation for large enterprises through our design, architecting, and solution skills. Over 100 large enterprises in the region have acknowledged our expertise in delivering solutions using key open-source technologies. Our offerings form a critical part of Digital transformation, Big Data platforms, Cloud and Web acceleration, and IT modernization. We represent EDB, Pentaho, Docker, Couchbase, MongoDB, Elastic, NGINX, Sysdig, Redis Labs, Confluent, and HashiCorp as their key partners in the region. Our team members bring decades of experience in delivering confidence to enterprises in adopting open source software and are known for their thought leadership.
THE POSITION
Ashnik is looking for talented and passionate people to be part of the team for an upcoming project at client location.
QUALIFICATION AND EXPERIENCE
- Preferably 2 or more years of working experience on production PostgreSQL DBs
- Experience of working in a production support environment
- Engineering or Equivalent degree
- Passion for open-source technologies is desired
ADDITIONAL SKILLS
- Install & Configure PostgreSQL, Enterprise DB
- Technical capabilities PostgreSQL 9.x, 10.x, 11.x
- Server tuning
- Troubleshooting of Database issues
- Linux Shell Scripting
- Install, Configure and maintain Fail Over mechanism
- Backup - Restoration, Point in time database recovery
- A demonstrable ability to articulate and sell the benefits of modern platforms, software and technologies.
- A real passion for being curious and a continuous learner. You are someone that invests in yourself as much as you invest in your professional relationships.
RESPONSIBILITIES
- Monitoring database performance
- Optimizing Queries and handle escalations
- Analyse and assess the impact and risk of low to medium risk changes on high profile production databases
- Implement security features
- DR implementation and switch over
LOCATION: Pune & Bangalore
Experience: 2 yrs plus
Package: up to 10 LPA
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong in ETL/SQL development (see the sketch after this list)
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have experience working with Python and shell scripting
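As a toy illustration of the ETL/SQL development skills listed above, here is a short Python sketch using the standard-library sqlite3 module to load a few rows and run a warehouse-style aggregation; the table and data are invented for the example and stand in for a real warehouse platform.

```python
import sqlite3

# In-memory database keeps the example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("west", 80.5), ("east", 42.25)],
)

# A typical warehouse-style aggregation query.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)
conn.close()
```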
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over these years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Datametica is Hiring for Datastage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
- Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be good at writing complex SQL queries.
About Us!
A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over these years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Experience - 2 to 6 Years
Work Location - Pune
Datametica is looking for talented SQL engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.
Mandatory Skills:
- Strong in SQL development
- Hands-on experience with at least one scripting language - preferably shell scripting
- Development experience in Data warehouse projects
Opportunities:
- Selected candidates will be provided learning opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing