
- A key responsibility is to design and develop data pipelines, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardization, and loading into the Data Warehouse at real-time or near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
- Provide technical expertise to design efficient data ingestion solutions that consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
- Develop complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDXs, AWS S3, etc., to load data into the OLAP Data Warehouse in structured/semi-structured form
- Develop data models and create transformation logic to populate them for faster data consumption with simple SQL.
- Implement automated audit and quality-assurance checks in the data pipeline
- Document and maintain data lineage to enable data governance
- Coordinate with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
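The cleansing/standardization step described above can be sketched in a few lines. This is a minimal illustration, not the actual pipeline: the field names (`id`, `signup_epoch`, `ltv`) and the quality rule are hypothetical, standing in for whatever the real source schema defines.

```python
import json
from datetime import datetime, timezone

def transform(raw: str) -> dict:
    """Cleanse and standardize one semi-structured source record
    into a flat row suitable for a warehouse load."""
    rec = json.loads(raw)
    return {
        # Coerce a stringly-typed key to the warehouse column type.
        "customer_id": int(rec["id"]),
        "name": rec.get("name", "").strip().title(),
        # Standardize epoch timestamps to ISO-8601 UTC.
        "signup_ts": datetime.fromtimestamp(
            rec["signup_epoch"], tz=timezone.utc
        ).isoformat(),
        # Example quality check: null out impossible negative amounts.
        "lifetime_value": rec["ltv"] if rec.get("ltv", 0) >= 0 else None,
    }

raw = '{"id": "42", "name": "  ada lovelace ", "signup_epoch": 1700000000, "ltv": -5}'
row = transform(raw)
```

In a real deployment the same logic would typically live inside a Talend Python/Java component or a SQL UDX rather than a standalone script.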
Requirements
- Programming experience using Python/Java to create functions/UDXs
- Extensive technical experience with SQL on RDBMSs (Oracle/MySQL/PostgreSQL, etc.), including code optimization techniques
- Strong ETL/ELT skill set using Talend Big Data Edition. Experience with Talend CDC & MDM functionality will be an advantage.
- Experience and expertise in implementing complex data pipelines, including semi-structured and unstructured data processing
- Expertise in designing efficient data ingestion solutions that consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
- Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
- Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
- Good knowledge of database performance and tuning, troubleshooting, query optimization, and tuning
- Good analytical skills with the ability to synthesize data to design and deliver meaningful information
- Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
- Know-how of any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage.
- Ability to understand business functionality, processes, and flows
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently
Functional knowledge
- Data Governance & Quality Assurance
- Distributed computing
- Linux
- Data structures and algorithms
- Unstructured Data Processing


- Should be able to write APIs and connect with external systems
- Should be able to develop plugins or modify existing plugins
Note: This is not a WordPress management role. Development experience is a must.
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
The Team
Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.).
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes
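The subject-predicate-object model at the heart of RDF and graph databases can be illustrated with a toy in-memory triple store. This is a teaching sketch only, not rdflib or Neo4j: the `TripleStore` class and the `ex:` identifiers are made up for illustration, and `query` mimics a single SPARQL triple pattern with `None` as a wildcard.

```python
from typing import Optional

class TripleStore:
    """A toy triple store illustrating the RDF subject-predicate-object model."""

    def __init__(self):
        self.triples: set[tuple[str, str, str]] = set()

    def add(self, s: str, p: str, o: str) -> None:
        self.triples.add((s, p, o))

    def query(self, s: Optional[str] = None, p: Optional[str] = None,
              o: Optional[str] = None) -> list[tuple[str, str, str]]:
        """Pattern match with None as a wildcard, like one SPARQL triple pattern."""
        return sorted(t for t in self.triples
                      if (s is None or t[0] == s)
                      and (p is None or t[1] == p)
                      and (o is None or t[2] == o))

kg = TripleStore()
kg.add("ex:Alice", "ex:worksFor", "ex:Acme")
kg.add("ex:Bob", "ex:worksFor", "ex:Acme")
kg.add("ex:Acme", "rdf:type", "ex:Company")

# "Who works for Acme?" -- analogous to: SELECT ?s WHERE { ?s ex:worksFor ex:Acme }
employees = [s for s, _, _ in kg.query(p="ex:worksFor", o="ex:Acme")]
```

A production knowledge graph would layer an ontology, indexes, and a full SPARQL engine on top of this same triple abstraction.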
Please find below the JD: Cisco Engineer (N+C) + Cisco Engineer (R&S) + SD-WAN
Up to 10 years of hands-on experience managing LAN, WAN, SD-WAN, DC networking, and Wi-Fi.
• Strong understanding of TCP/IP, routing protocols, L2/L3 switches, Wi-Fi 802.11 and SD-WAN networks experience.
• Experience in Implementation and Troubleshooting of Wireless controller, Access Points, 802.1x, Wi-Fi protocols (802.11a/b/g/n/ae/ac)
• Good understanding of VLAN, VTP, DTP, 802.1Q trunk, STP, MSTP, ACL, SNMP config.
• Experience working on routers and switches (Cisco ASR, ISR, Catalyst, Nexus)
• Experience in configuring and troubleshooting BGP, OSPF, IS-IS, EIGRP and static routes.
• Experience in configuring and troubleshooting STP, MSTP, RSTP, VSTP, HSRP, DHCP.
• Should have experience configuring and managing the Cisco Viptela SD-WAN solution.
• Prior experience in Implementation and migration projects involving above-mentioned technologies is mandatory.
• Strong troubleshooting and problem-solving skills.
• Positive, communicative, and customer-oriented attitude.

Job Description :
*4+ years of software design and development experience.
*Programming experience building .NET web applications in .NET Core/.NET 6
*Must have strong knowledge of any database.
*AWS and Azure experience will be an added advantage.
*The candidate must have developed a browser-based application and used web services and APIs.
*Good communication skills (both verbal and written)
*Able to prioritize and deliver project activities on time
*Good understanding of client requirements.
Profile:- Field Business Development Executive
Location: Noida
Department – Sales
Qualification: Must be a Graduate (MBA preferred ).
Compensation :
Full Time (for 8 hours) : Up to 25 K (Remote City)
Purpose of the Role –
• Achieve lasting customer success and higher profitability
• Leverage Justdial's strength in creating clients' campaigns where they get maximum benefit at the most cost-effective investment
• Drive sales and increase the number of client acquisitions, thereby generating revenue for our organization
Key Responsibilities –
1. Meet commercial establishments that are located within the area allocated to him/her and adhere to the following steps:
a. Present the business offering and explain the benefits of the brand to the prospective clients.
b. Provide a demo and explain the advantages and features of the services.
c. Collect qualitative business information of the business enterprises he/she visits and also explain how the same would be represented to the users of Justdial.
d. Explain the contract, its features, tenure, and all terms and conditions to the customer in detail.
e. Answer the queries raised by the customers.
f. Persuade the business owners/managers to register with Justdial as paid customers, which would enhance their business.
g. Upload geo-coded photos to update the profiles on the Justdial database.
2. Send Key Parameter Monitor (KPM) reports to the reporting managers daily.
3. Submit the contracts to the office with proper documentation.
4. Ensure that the contract cheque is cleared and the account of the client is activated within Justdial.
5. Adhere to the compliance and policies set by the department.
Skills and Work Experience Required:
1. Language Proficiency - Fluent in English, Hindi or any other regional language
2. Communication Skills - Good communication skills to explain the services of Justdial and to comprehend and address clients' queries and doubts
3. Other requirements –
• Ability to work under pressure
• Ability to comprehend and follow instructions and directions
Job Summary:
We want a techie at heart. Someone who is happy and curious to work on all aspects of software development.
Reporting directly to the CTO, you will be responsible for feature design, development, and continuously optimizing our tech stack.
- We are looking for an experienced software engineer with at least 5 years of experience in a startup or product environment. Ideally you have been involved in all aspects of software development from requirements gathering to design, development, deployment and post-release support. We are looking for all-round technical maturity. Our tech stack is Angular, Spring boot and Django/Python.
Key Skills
Java
SpringBoot
PostgreSQL/MySQL
Git
AWS
REST api design
Experience integrating with external APIs
Good applied understanding of Object Oriented Programming
Good database modeling and SQL knowledge.
React is a big plus.
Responsibilities and Duties
Build out features across the stack: backend, API design and integration, database optimization, microservices, plugins, queues, etc.
Fix bugs and write automated tests
Maintain and upgrade our Tech Stack
Translate requirements to design and write/present articulate software design.
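The "database modeling and SQL knowledge" expected above can be sketched with a minimal two-table model. This is an illustrative example using Python's built-in sqlite3, not the role's actual stack (PostgreSQL/MySQL behind Spring Boot); the `users`/`orders` schema is hypothetical.

```python
import sqlite3

# Hypothetical two-table model: users and their orders, linked by a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         user_id INTEGER NOT NULL REFERENCES users(id),
                         total   REAL    NOT NULL);
""")
conn.execute("INSERT INTO users VALUES (1, 'Ada')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 19.99), (2, 1, 5.01)])

# Aggregate spend per user -- the kind of query a well-normalized schema keeps simple.
row = conn.execute("""
    SELECT u.name, ROUND(SUM(o.total), 2)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id
""").fetchone()
```

The same schema and join translate directly to JPA entities and a Spring Data repository query in the Java/Spring Boot stack the role describes.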


• This is an individual contributor role; the incumbent needs to demonstrate deep technical skills for developing cloud-based web applications on Microsoft technologies.
• Ability to convert requirements into tasks and execute them. Troubleshoot and resolve application issues and identify deficiencies. Participate in the Scrum process.
• Understand and apply the principles of software engineering and display software craftsmanship.
• Elicit requirements from end-users and stakeholders as needed. Work with QA to construct system and integration test plans.
Must-Have
• 3+ years of hands-on experience in software application development; proficient in C#, ASP.NET, MVC, WCF, LINQ, jQuery & SQL Server. MVC with .NET experience, database (SQL), and .NET Framework experience.
• WCF & Web API experience
• Third-Party Integrations
• Strong knowledge Of OOP principle
• Familiar with JavaScript and jQuery
• Good understanding of Angular
• Must have worked on multi-tenancy
• Familiar with agile SDLC
Good to have
• Server and hosting knowledge
• Skill to write dev unit test cases
Behavioural Requirements
• Good Communication Skills
• Should be a good listener
• Can articulate well
• High on accountability
Education:
Engineering Graduate/Postgraduate in Computer Science
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias toward action. As part of the early engineering team, you will have a chance to make a measurable impact on the future of Thinkdeeply, as well as a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas including automation, tooling, and management applications. Experience with or desire to learn Machine Learning a plus.
Experience
12+ Years
Location
Hyderabad
Skills
Bachelor's/Master's/PhD in CS or equivalent industry experience
10+ years of industry experience in Java-related frameworks such as Spring and/or Typesafe
Experience with scripting languages; Python experience highly desirable. 5+ years of industry experience in Python
Experience with popular modern web frameworks such as Spring boot, Play framework, or Django
Demonstrated expertise of building and shipping cloud native applications
Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd
Experience in API development using Swagger
Strong expertise with containerization technologies including kubernetes, docker-compose
Experience with cloud platform services such as AWS, Azure or GCP.
Implementing automated testing platforms and unit tests
Proficient understanding of code versioning tools, such as Git
Familiarity with continuous integration, Jenkins
Responsibilities
Architect, Design and Implement Large scale data processing pipelines
Design and Implement APIs
Assist in dev ops operations
Identify performance bottlenecks and bugs, and devise solutions to these problems
Help maintain code quality, organization, and documentation
Communicate with stakeholders regarding various aspects of the solution.
Mentor team members on best practices
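The streaming side of the pipeline work described above often reduces to windowed aggregation over an ordered event stream. Below is a framework-free sketch of a tumbling-window count; the event tuples and the `tumbling_window_counts` helper are invented for illustration, with a plain iterable standing in for a Kafka partition.

```python
from collections import defaultdict
from typing import Iterable, Iterator

def tumbling_window_counts(events: Iterable[tuple[int, str]],
                           window_s: int) -> Iterator[tuple[int, dict]]:
    """Group (timestamp, key) events into fixed-size windows and emit
    per-key counts. Assumes events arrive in timestamp order, as a
    single Kafka partition would deliver them."""
    current, counts = None, defaultdict(int)
    for ts, key in events:
        win = ts - ts % window_s  # start of this event's window
        if current is not None and win != current:
            yield current, dict(counts)  # window closed: emit and reset
            counts = defaultdict(int)
        current = win
        counts[key] += 1
    if current is not None:
        yield current, dict(counts)  # flush the final open window

events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
windows = list(tumbling_window_counts(events, window_s=10))
```

Production frameworks (Kafka Streams, Flink, and the like) add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.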



