• Provides remote planning (design), implementation, and/or administrative support on Dell server and storage products involving software.
• Performs initial installation, implementation, customization, integration, and outline orientation for the customer.
• Works closely with other Dell teams, the account team, and the customer.
Essential Skill Requirements:
• Understanding of the compute environment ecosystem
• Dell PowerEdge and modular servers: planning, implementation, and/or administration
• Dell PowerVault MD/ME4 series storage: planning, implementation, and/or administration
• Dell Storage NX and SC series: planning and implementation
• Experience with basic network switch technologies (Ethernet, Fibre Channel), IP networking, and L2 switches
• Operating system installation and configuration:
o Windows Server (including Hyper-V clustering)
o VMware ESXi and virtualization
o Red Hat Linux
• File, P2V, and/or V2V migration experience is an added advantage
Desirable Requirements:
• Customer service skills
• Stakeholder management
• Excellent problem-solving, communication, and organizational skills
• Flexibility, dependability, and excellent time management skills
• Good presentation skills
• Analytical, articulate, results-oriented, and able to provide excellent follow-up
• Strong technical aptitude
• Ability to multi-task and influence others to achieve results
• Professional certification from Cisco/VMware/Microsoft/Red Hat/cloud providers is an added advantage
Join us at Springer Capital, a corporate inclusion training company dedicated to promoting diversity and equity within the workplace.
Our mission is to transform organizational cultures by creating environments where every individual is valued and feels a sense of belonging. Through training on workplace inclusion, microaggressions, bias mitigation, and cultural literacy, Springer Capital seeks to eliminate bias and establish a fair, just working environment.
As a Data Automation Intern, you will focus on researching and developing tools and workflows that automate parts of our business processes. Since automation matters throughout the firm, you will have the opportunity to collaborate with various teams across Springer Capital.
Key Responsibilities:
- Collect data from various sources, including databases, APIs, and web scraping tools.
- Clean and process raw data to ensure it is accurate and consistent.
- Analyze data to extract insights using computational tools, such as Excel, SQL, and Python.
- Communicate insights and your progress clearly and concisely to your manager.
- Implement solutions based on the insights you discovered to improve Springer Capital's business processes or solve problems for clients.
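The collect, clean, and analyze steps above can be sketched in a few lines of Python. This is a minimal illustration on a hypothetical dataset (the names and fields are invented), using only the standard library; real work would likely involve pandas, SQL, or APIs.

```python
import csv
import io
from statistics import mean

# Hypothetical raw data with inconsistent casing, stray spaces,
# a missing value, and a duplicate row.
RAW = """name, score
Alice , 82
BOB,
carol,91
Alice ,82
"""

def clean_rows(text):
    """Parse CSV text, normalize names, drop rows with missing scores,
    and de-duplicate records."""
    rows, seen = [], set()
    for row in csv.DictReader(io.StringIO(text), skipinitialspace=True):
        name = row["name"].strip().title()
        score = row["score"].strip() if row["score"] else ""
        if not score.isdigit():
            continue  # drop incomplete records
        record = (name, int(score))
        if record not in seen:  # drop exact duplicates
            seen.add(record)
            rows.append(record)
    return rows

rows = clean_rows(RAW)
print(rows)                                        # [('Alice', 82), ('Carol', 91)]
print(round(mean(score for _, score in rows), 1))  # 86.5
```

The same clean-then-aggregate shape applies whether the source is a database query, an API response, or a scraped page.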
Qualifications:
- Passion for Inclusion: A strong commitment to fighting inequality and promoting inclusion in the workplace.
- Educational Background: Currently enrolled in or recently graduated from a degree program related to social sciences, human resources, business, or statistics.
- Communication Skills: Excellent written and verbal communication skills. Ability to create clear and engaging content.
- Organizational Skills: Detail-oriented and highly organized. Ability to manage multiple tasks and deadlines effectively.
- Analytical and Business Software Skills: Proficiency in business software such as Excel, PowerPoint, and Word, with a strong preference for knowledge and experience in data analytics.
Role & Responsibilities
- Create innovative architectures based on business requirements.
- Design and develop cloud-based solutions for global enterprises.
- Coach and nurture engineering teams through feedback, design reviews, and best practice input.
- Lead cross-team projects, ensuring resolution of technical blockers.
- Collaborate with internal engineering teams, global technology firms, and the open-source community.
- Lead initiatives to learn and apply modern and advanced technologies.
- Oversee the launch of innovative products in high-volume production environments.
- Develop and maintain high-quality software applications using JavaScript technologies (React, Node.js, npm, etc.).
- Utilize design patterns for backend technologies and ensure strong coding skills.
- Deploy and manage applications on AWS cloud services, including ECS (Fargate), Lambda, and load balancers. Work with Docker to containerize services.
- Implement and follow CI/CD practices using GitLab for automated build, test, and deployment processes.
- Collaborate with cross-functional teams to design technical solutions, ensuring adherence to Microservice Design patterns and Architecture.
- Apply expertise in Authentication & Authorization protocols (e.g., JWT, OAuth), including certificate handling, to ensure robust application security.
- Utilize databases such as Postgres, MySQL, MongoDB, and DynamoDB for efficient data storage and retrieval.
- Demonstrate familiarity with Big Data technologies, including but not limited to:
- Apache Kafka for distributed event streaming.
- Apache Spark for large-scale data processing.
- Containers for scalable and portable deployments.
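The JWT-based authentication mentioned above can be sketched with nothing but the standard library. This is a simplified HS256 sign/verify sketch for illustration only; the secret and claims are hypothetical, and production code should use a vetted library (e.g., PyJWT) and also validate claims such as expiry.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT compact format requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build a compact HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: bytes):
    """Recompute the signature and compare in constant time;
    return the payload on success, None on tampering."""
    header, body, sig = token.split(".")
    expected = _b64url(
        hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    )
    if not hmac.compare_digest(expected, sig):
        return None
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-42", "role": "admin"}, b"shared-secret")
print(verify_jwt(token, b"shared-secret"))   # {'sub': 'user-42', 'role': 'admin'}
print(verify_jwt(token, b"wrong-secret"))    # None
```

The constant-time comparison (`hmac.compare_digest`) is the detail interviewers often probe: naive `==` comparison leaks timing information.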
Technical Skills:
- 7+ years of hands-on development experience with JavaScript frameworks, specifically the MERN stack.
- Strong coding skills in backend technologies using various design patterns.
- Strong UI development skills using React.
- Expert in containerization using Docker.
- Knowledge of cloud platforms, specifically OCI, and familiarity with serverless technologies and services such as ECS, Lambda, and load balancers.
- Proficiency in CI/CD practices using GitLab or Bamboo.
- Strong knowledge of microservice design patterns and architecture.
- Expertise in authentication and authorization protocols such as JWT and OAuth, including certificate handling.
- Experience working with high-volume streaming media data.
- Experience working with databases such as Postgres, MySQL, and DynamoDB.
- Familiarity with Big Data technologies such as Kafka, Apache Spark/PySpark, and containers.
- Experience with container orchestration tools like Kubernetes.
Responsibilities:
- Create exclusive distributors, exclusive super stockists, and business associates for the company to drive business growth.
- Guide, coordinate, and make strategic marketing plans for the sales team under their jurisdiction.
- Find and develop new markets and improve sales.
- Monitor stock reports, inventory, product orders, re-orders, etc.
Requirements:
- Bachelor's degree in business, marketing, or a related field.
- Experience in sales, marketing, or a related field.
- Strong communication skills.
- Proficient in Word, Excel, Outlook, and PowerPoint.
You will write scalable Golang code to develop and implement robust applications, building server-side logic that ensures low latency and high performance. You should have sound knowledge of Kubernetes, Docker, and microservices.
YOUR ‘OKR’ SUMMARY
OKR stands for Objectives and Key Results.
As a Senior Development Engineer at Coredge, you will help develop our next-generation cloud-native core solution, working with the product team and the open-source community to build the Coredge.io vision.
What you will do:
- System engineering and implementation in Golang.
- Working on performance issues using creative experiments and internally developed product features.
- Research, propose, and integrate relevant open-source projects based on product objectives.
- Write organized, efficient, and well-documented Python/Golang code as an example for junior engineers.
- Participation in all levels of product definition, design, implementation, testing, and deployment.
- Must be able to discuss abstract system architectures from idea through implementation and creatively apply domain experience to solve technical challenges.
- Mentoring software engineers, fostering an environment of trust and accountability.
What you will need:
A strong sense of ownership, urgency, and drive. As an integral part of the development team, you will need the
following skills to succeed.
- Strong Golang skills to develop frameworks.
- Hands-on experience designing and developing reusable framework components.
- Experience in engineering practices such as code refactoring, design patterns, design-driven development, Continuous Integration, building highly scalable applications, application security, and functional programming.
Additional Skills:
- Knowledge of cloud-native technologies would be an advantage.
- Understanding of Kubernetes architecture and its standard APIs.
- Code contributions to CNCF or similar communities are a plus.
- Performance benchmarking of Kubernetes or any cloud is an added advantage.
Additional Advantage:
- Deep understanding of technology and passion for what you do.
- Background in designing high-performance, scalable software systems with a strong focus on optimizing hardware cost.
- Solid collaborative and interpersonal skills, specifically a proven ability to effectively guide and influence within a dynamic environment.
- Strong commitment to getting the most performance out of the system being worked on.
- Prior development of a large software project using service-oriented architecture operating under real-time constraints.
What's In It for You?
- You will get a chance to work on cloud-native and hyperscale products.
- You will be working with industry leaders in cloud.
- You can expect a steep learning curve.
- You will get experience solving real-time problems, eventually becoming a stronger problem solver.
Benefits & Perks:
- Competitive Salary
- Health Insurance
- Open Learning - 100% Reimbursement for online technical courses.
- Fast Growth - opportunities to grow quickly and surely
- Creative Freedom + Flat hierarchy
- Sponsorship for employees who represent the company at events and meetups.
- Flexible working hours
- 5-day work week
- Hybrid working model (office and WFH)
Our Hiring Process:
Candidates for this position can expect the following hiring process (subject to successfully clearing each round):
- Initial resume screening call with our recruiting team
- Next, candidates will be invited to solve coding exercises.
- Next, candidates will be invited for the first technical interview.
- Next, candidates will be invited for the final technical interview.
- Finally, candidates will be invited for a Culture Plus interview with HR.
- Candidates may be asked to interview with the leadership team.
- Successful candidates will subsequently be made an offer via email.
As always, the interviews and screening calls will be conducted via a mix of telephone and video calls.
So, if you are looking for an opportunity to really make a difference, make it with us.
Coredge.io provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by applicable central, state or local laws.
- Plan, manage, implement, support, and/or monitor SAP integration solutions and middleware for on-premise, cloud, or hybrid scenarios
- Plan and conduct customer workshops, design and build integrations and interfaces, and create plans for data integration and unit testing
- Communicate effectively with project team members at different technical knowledge levels, often remotely across diverse geographical locations
- Identify test data for unit testing, integration testing, and user acceptance, working with the client's functional consultants to validate test results
- Build relationships with your clients to help Accenture become the go-to partner for SAP data integration and interfaces
- Manage small-to-medium-sized teams and/or work efforts at a client
• Experience in PEGA and/or TIBCO BPM systems is a plus
• Worked with very complex workflows, asynchronous tasks, user tasks, event listeners and Business Central deployments and APIs
• Ability to configure BPM workflows as per client need. Experience at client location is preferred
• Strong knowledge of BPMN2.0, DMN and CMMN. Hands on workflow configuration is preferred
• Good knowledge on CI/CD, DevOps, Scrum practices
• Ability to adapt and work in an agile fast paced environment
• Collaborates with multiple teams of developers, BAs, and designers to implement project specifications, providing workflow support and technical guidance to less experienced team members
• Very good analytical and problem-solving ability, verbal and written communication skills, and expertise in client demo environments
Create and modify database schemas, T-SQL scripts, stored procedures, other database objects, and SSIS packages to meet customer requirements.
Must have strong working knowledge of SQL Server Management Studio.
Set up, maintain, and support jobs in SQL Server Agent.
Required Skills:
MS SQL Server 2012/2016 and T-SQL in OLAP and OLTP
Operating Systems: Windows Server 2008/2012/2016
Database: MS SQL Server 2016/2019
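T-SQL, SSIS, and SQL Server Agent are Microsoft-specific, but the underlying pattern (define a schema, then run reusable parameterized logic against it) can be sketched with Python's stdlib sqlite3 module. The `orders` schema below is hypothetical and purely illustrative; SQL Server syntax and tooling differ.

```python
import sqlite3

# In-memory database standing in for a customer's SQL Server instance.
conn = sqlite3.connect(":memory:")

# Schema creation script: table, constraint, and supporting index.
conn.executescript("""
    CREATE TABLE orders (
        id       INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount   REAL NOT NULL CHECK (amount >= 0)
    );
    CREATE INDEX idx_orders_customer ON orders (customer);
""")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 45.5)],
)

def total_for(customer: str) -> float:
    """Stand-in for a stored procedure: a reusable, parameterized aggregate."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer = ?",
        (customer,),
    ).fetchone()
    return row[0]

print(total_for("acme"))    # 200.0
```

In SQL Server the same logic would live in a `CREATE PROCEDURE` body and be scheduled or invoked via SQL Agent jobs and SSIS packages.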
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions.
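Spark's pipeline model (map over partitioned data, then reduce the partial results) can be mirrored in plain Python. This toy sketch only illustrates the shape of the computation on an invented log dataset; a real pipeline would use PySpark's `flatMap`/`reduceByKey` over distributed partitions.

```python
from collections import Counter
from functools import reduce

# Hypothetical log lines, pre-split into two "partitions" to mimic
# how Spark distributes data across executors.
partitions = [
    ["error timeout", "ok", "error disk"],
    ["ok", "ok", "error timeout"],
]

def map_partition(lines):
    """Map step: count the status token per line, like map + reduce
    within a single Spark partition."""
    return Counter(line.split()[0] for line in lines)

def merge(a, b):
    """Reduce step: combine per-partition counts, like reduceByKey."""
    return a + b

counts = reduce(merge, (map_partition(p) for p in partitions))
print(dict(counts))   # {'error': 3, 'ok': 3}
```

The key property (per-partition work is independent, and the merge is associative) is what lets Spark scale the same logic to terabytes.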
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 5 years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data, and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design, and business processes
Proficiency in :
- Modern programming languages like Java, Python, and Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow and Jenkins for CI/CD (optional)
- Responsible for the design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices
- Experience generating physical data models and the associated DDL from logical data models
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts
- Experience enforcing data modeling standards and procedures
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool such as Pentaho, NiFi, or SSIS (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
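Airflow expresses a pipeline as a DAG of tasks and runs them in dependency order. The stdlib `graphlib` module can sketch that core idea; the task names below are hypothetical, and real Airflow adds scheduling, retries, and operators on top.

```python
from graphlib import TopologicalSorter

# A tiny pipeline DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "clean":   {"extract"},
    "load":    {"clean"},
    "report":  {"load", "clean"},
}

# Resolve a valid execution order; Airflow's scheduler does the
# equivalent (plus parallelism) for independent tasks.
order = list(TopologicalSorter(dag).static_order())
print(order)   # a valid order, e.g. ['extract', 'clean', 'load', 'report']
```

The useful interview point: any order is acceptable as long as every task appears after all of its dependencies, which is exactly what a topological sort guarantees.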
You must possess good communication skills, .NET and C# development skills, and server connectivity experience.
The candidate should be able to design solutions and handle or lead projects independently when a senior is not available.
A great opportunity to work, grow, and visit different countries awaits you.





