11+ DSLAM Jobs in Hyderabad

Mandatory skills:
- Programming skills in Python, Robot Framework, Selenium, and shell scripting
- Experience with L2/L3 protocols such as VLAN, DHCP, LACP, IGMP, and PPPoE (see the sketch after this list)
- Familiarity with device configuration protocols such as CLI, NETCONF, and SNMP
- Experience with telecom technologies such as DSLAM, GPON, G.fast, and other next-generation broadband technologies is highly desirable
- Knowledge of regression, performance, load, scale, and stability test areas
- Hands-on experience with common industry equipment such as Spirent TestCenter, Ixia, Abacus, Shenick (TeraVM), and N2X traffic generators
- Exposure to debugging tools such as Wireshark and tcpdump
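As a hedged illustration of the protocol and capture-tool skills above (not part of the job description itself), a minimal Scapy sketch that crafts a VLAN-tagged DHCP Discover, whose exchange could then be verified with tcpdump or Wireshark; the interface and MAC address are placeholders:

```python
# Illustrative sketch only: craft and send a VLAN-tagged DHCP Discover with
# Scapy, then verify the exchange in Wireshark/tcpdump.
# The interface name and MAC address are placeholders.
from scapy.all import Ether, Dot1Q, IP, UDP, BOOTP, DHCP, sendp

mac = "00:11:22:33:44:55"  # placeholder client MAC
frame = (
    Ether(src=mac, dst="ff:ff:ff:ff:ff:ff")
    / Dot1Q(vlan=100)                                   # 802.1Q tag, VLAN 100
    / IP(src="0.0.0.0", dst="255.255.255.255")
    / UDP(sport=68, dport=67)                           # DHCP client -> server
    / BOOTP(chaddr=bytes.fromhex(mac.replace(":", "")))
    / DHCP(options=[("message-type", "discover"), "end"])
)
sendp(frame, iface="eth0")  # requires root; "eth0" is a placeholder
```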
Job Requirements:
- Knowledge of the software test cycle, including test plan and test case creation
- Understanding of end-to-end test setup topology and debugging
- Ability to perform system-level functional and non-functional tests
- Familiarity with the manual test life cycle
- Designing and writing test automation scripts using automation frameworks (see the sketch after this list)
- Exposure to CI/CD pipeline implementation and maintenance using Jenkins and Groovy scripting
- Linux skills (system configuration and administration, containers, and networking, e.g., sockets) and database management
- Good debugging skills and knowledge of various debugging tools
- Bug reporting/tracking and providing logs
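By way of a hedged illustration of the automation-scripting bullet above, a minimal pytest-style Selenium check; the management IP, page title, and element locator are hypothetical placeholders:

```python
# Hedged sketch: a minimal pytest-style Selenium test. The URL, title, and
# element ID are hypothetical; requires a Chrome/chromedriver setup.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

@pytest.fixture
def driver():
    drv = webdriver.Chrome()
    yield drv
    drv.quit()

def test_device_login_page_loads(driver):
    driver.get("http://192.0.2.1/login")           # placeholder device mgmt IP
    assert "Login" in driver.title                 # placeholder page title
    assert driver.find_element(By.ID, "username")  # placeholder locator
```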
Overview:
We are seeking a talented and experienced GCP Data Engineer with strong expertise in Teradata, ETL, and Data Warehousing to join our team. As a key member of our Data Engineering team, you will play a critical role in developing and maintaining data pipelines, optimizing ETL processes, and managing large-scale data warehouses on the Google Cloud Platform (GCP).
Responsibilities:
- Design, implement, and maintain scalable ETL pipelines on GCP (Google Cloud Platform).
- Develop and manage data warehouse solutions using Teradata and cloud-based technologies (BigQuery, Cloud Storage, etc.; see the query sketch after this list).
- Build and optimize high-performance data pipelines for real-time and batch data processing.
- Integrate, transform, and load large datasets into GCP-based data lakes and data warehouses.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Write efficient, clean, and reusable code for ETL processes and data workflows.
- Ensure data quality, consistency, and integrity across all pipelines and storage solutions.
- Implement data governance practices and ensure security and compliance of data processes.
- Monitor and troubleshoot data pipeline performance and resolve issues proactively.
- Participate in the design and implementation of scalable data architectures using GCP services like BigQuery, Cloud Dataflow, and Cloud Pub/Sub.
- Optimize and automate data workflows for continuous improvement.
- Maintain up-to-date documentation of data pipeline architectures and processes.
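As a hedged illustration of the BigQuery-facing responsibilities above (the project, dataset, and table names are hypothetical), a small aggregation query using the official google-cloud-bigquery Python client:

```python
# Illustrative sketch: run an aggregation against BigQuery with the official
# google-cloud-bigquery client. Project, dataset, and table are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
query = """
    SELECT user_id, SUM(amount) AS total_amount
    FROM `my-gcp-project.analytics.transactions`
    GROUP BY user_id
    ORDER BY total_amount DESC
    LIMIT 10
"""
for row in client.query(query).result():  # blocks until the job finishes
    print(row.user_id, row.total_amount)
```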
Requirements:
Technical Skills:
- Google Cloud Platform (GCP): Extensive experience with BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Composer.
- ETL Tools: Expertise in building ETL pipelines using tools such as Apache NiFi, Apache Beam, or custom Python-based scripts (see the pipeline sketch after this list).
- Data Warehousing: Strong experience working with Teradata for data warehousing, including data modeling, schema design, and performance tuning.
- SQL: Advanced proficiency in SQL and relational databases, particularly in the context of Teradata and GCP environments.
- Programming: Proficient in Python, Java, or Scala for building and automating data processes.
- Data Architecture: Knowledge of best practices in designing scalable data architectures for both structured and unstructured data.
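To make the ETL-tooling expectation concrete, here is a hedged sketch of a batch pipeline using the Apache Beam Python SDK that reads CSV files from Cloud Storage and appends rows to BigQuery; the project, bucket, table, and schema are hypothetical placeholders, not a prescribed implementation:

```python
# Hedged sketch: a batch ETL pipeline with the Apache Beam Python SDK.
# Project, bucket, dataset, table, and schema are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line: str) -> dict:
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DataflowRunner",          # or "DirectRunner" for local testing
    project="my-gcp-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.transactions",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```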
Experience:
- Proven experience as a Data Engineer, with a focus on building and managing ETL pipelines and data warehouse solutions.
- Hands-on experience in data modeling and working with complex, high-volume data in a cloud-based environment.
- Experience with data migration from on-premises to cloud environments (Teradata to GCP).
- Familiarity with Data Lake concepts and technologies.
- Experience with version control systems like Git and working in Agile environments.
- Knowledge of CI/CD and automation processes in data engineering.
Soft Skills:
- Strong problem-solving and troubleshooting skills.
- Excellent communication skills, both verbal and written, for interacting with technical and non-technical teams.
- Ability to work collaboratively in a fast-paced, cross-functional team environment.
- Strong attention to detail and ability to prioritize tasks.
Preferred Qualifications:
- Experience with other GCP tools such as Dataproc, Bigtable, Cloud Functions.
- Knowledge of Terraform or similar infrastructure-as-code tools for managing cloud resources.
- Familiarity with data governance frameworks and data privacy regulations.
- Certifications in Google Cloud or Teradata are a plus.
Benefits:
- Competitive salary and performance-based bonuses.
- Health, dental, and vision insurance.
- 401(k) with company matching.
- Paid time off and flexible work schedules.
- Opportunities for professional growth and development.
Application development:
Development of forms and reports.
Creation of SQL packages, functions, procedures, etc. (see the sketch below).
Should have worked extensively on Oracle EBS implementation, upgrade, and support projects, using tools such as JIRA.
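As a hedged illustration only (the package, procedure, credentials, and DSN below are all hypothetical), such PL/SQL packages are often exercised from Python via the python-oracledb driver:

```python
# Hedged sketch: calling a custom PL/SQL package procedure from Python with
# python-oracledb. Package, procedure, credentials, and DSN are hypothetical.
import oracledb

with oracledb.connect(user="apps", password="secret",
                      dsn="ebs-host/EBSDB") as conn:
    with conn.cursor() as cur:
        # Hypothetical procedure from a custom EBS package
        cur.callproc("xx_custom_pkg.process_invoice", [12345])
    conn.commit()
```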
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment (see the sketch after this list).
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
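As a hedged illustration of the concurrency bullet above (the partition count and per-partition work are placeholders, not Dremio's actual code), the basic fan-out/fan-in shape looks like this in Python:

```python
# Hedged sketch: fan out work across a thread pool and merge partial results,
# the basic shape of parallel aggregation. Sizes and work are placeholders.
from concurrent.futures import ThreadPoolExecutor

def process_partition(partition_id: int) -> int:
    # Placeholder per-partition work (e.g., scan + partial aggregate)
    return sum(range(partition_id * 1_000, (partition_id + 1) * 1_000))

with ThreadPoolExecutor(max_workers=8) as pool:
    partials = list(pool.map(process_partition, range(16)))  # fan-out

total = sum(partials)  # fan-in: merge the partial aggregates
print(total)
```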
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 3+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems and object stores such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
About Mudrantar Solutions Private Limited
Mudrantar Solutions Pvt. Ltd. is a wholly owned subsidiary of the US-based startup Mudrantar Corporation. Mudrantar is a well-funded startup focused on disruptive changes in accounting software for small, medium, and large businesses in India. Our state-of-the-art OCR + machine learning technology allows customers to simply take photos, and our software does the rest of the heavy lifting. Our strategy for CAs, CSs, and tax practitioners is realized through web access for our customers to manage their practice, along with client communication through a freely available mobile app. We also offer data entry automation services through an AI/ML platform.
HR Associate
As an HR Associate at this IT startup, you will use your unique blend of HR and communication skills to recruit top talent in the IT industry as well as retain employees. In this role, you will be responsible for obtaining and recording HR information, managing the HR database, and assisting company employees with enrollment procedures and HR-related issues.
Position
- Full time employment
Location
- Hyderabad or Pune (preferred)
- Any location in India
Requirements
- End-to-End Recruitment
- Onboarding & Induction
- Learning & Development
- Employee Engagement
- Personnel Record Management in ERP
- Attendance & Payroll Assistance
- Employee Retention, Exit Interviews & Formalities
Salary:
₹350,000.00 - ₹500,000.00 per year
Benefits:
- Health insurance
- Paid sick time
- Paid time off
- Work from home
Education:
- Master's (Preferred)
Experience:
- Human Resources Generalist: 2+ years (Preferred)
- Recruitment: 2+ years (Preferred)
- HRIS: 2+ years (Preferred)



Our client is looking for a Senior/Lead Full Stack .NET Developer for their headquarters office, located in San Francisco, California. They are looking for top-notch developers who are passionate about and well-versed in modern technologies like .NET Core, MVC, SQL, and JS frameworks.
This position gives you an opportunity to lead exciting projects, work directly with teams in the US and with super smart people that you can learn from, and contribute to their knowledge.
Requirement:
- 7-15 years of experience in software development.
- Hands-on design and coding are required.
- Expertise in React.js.
- Good knowledge of database concepts and Microsoft SQL Server.
- Highly proficient in modern JavaScript, HTML, CSS, and React.js, and in one or more libraries for state management (e.g., Redux)
- Solid foundation in computer science, with strong competencies in data structures, design patterns, concurrency, algorithms, and software design
- Research and evaluate new software, frameworks, and techniques to provide recommendations to the division.
- Design and develop robust and scalable software components.
- Strong analytical and troubleshooting skills.
- Bachelor's degree in Computer Science or a technical field.
Perks & Benefits:
- Friendly, talented, collaborative, and entrepreneurial team
- Competitive and comprehensive benefits and perks
- Generous holiday and PTO policies
- Training and development opportunities and allowance
- Fun and inclusive digital, and (in the future) in-person events
- Employee groups - DEI committee, fun committee, wellness group, and more
- Flexible remote work
About company
The company is the industry-leading collection of online destinations (including Astrology.com, Horoscope.com, Keen, and PsychicCenter) providing spiritual guidance on love, relationships, career, health, and life overall. They are passionate about connecting people with the world's best advisors (including psychics, tarot card readers, life coaches, and more) and content to empower everyone to live happier lives.
A team of over 100 employees is powered by their diverse perspectives and company core values:
- They are humble. They believe the best result is achieved by leveraging others' perspectives.
- They think like owners. They make decisions that optimize for the greater good of the organization.
- They challenge limiting beliefs. They are at their best when they identify and shatter status quo expectations.
The company is an equal opportunity workplace and is an affirmative action employer. They are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status.
Title: Salesforce Developer/ Architect
Location: Hyderabad/Vizag
Experience: 4+ years
Required Skills:
4+ years of enterprise consulting experience spanning large organizations and enterprise Salesforce platforms.
4+ years of experience as a Senior Architect and/or technology leader in a mid-to-large-sized organization.
A minimum of 1-2 projects in the non-profit sector, or as an architect on a Salesforce Community implementation with more than 10K users.
Involved in a minimum of 4 end-to-end implementations, including a minimum of 1 end-to-end Community Cloud implementation.
Experience with key areas of enterprise architecture including Cloud, Integration Technologies, Master Data Management.
Hands-on experience with Agile Implementation Frameworks.
Proven ability to analyze, design, and optimize business processes via out-of-the-box product, technology and integration.
Success in guiding customers and colleagues in rationalizing and deploying emerging technologies that drive increased business value.
Excellent communication skills, both written and verbal. Able to effectively develop materials that are appropriate for the audience.
Able to multitask in a fast-paced environment and exercise a high degree of initiative in resolving issues and developing process enhancement recommendations.
Excellent team player, able to work with virtual and global cross-functional teams and lead through influence.
Salesforce Certified Nonprofit Cloud Consultant.
Lead a team of 5-8 developers.
Prepare technical designs based on business requirements for Sales and Service Cloud.
Perform build and unit testing of Salesforce modules based on technical designs, following best practices and ensuring expected code quality.
Conduct peer reviews and guide team members.
Report the status of tasks completed by the team per the agreed schedule.
Must be proficient with Salesforce Apex, Lightning, LWC, and batch classes.