
Risk Resources LLP, Hyderabad
https://www.riskresourcesindia.com
Jobs at Risk Resources LLP, Hyderabad


Desired Competencies (Technical/Behavioral Competency)
Must-Have 1. Experience working with common ML libraries and packages such as scikit-learn, NumPy, pandas, TensorFlow, Matplotlib, Caffe, etc. (a minimal sketch follows this list).
2. Deep Learning Frameworks: PyTorch, spaCy, Keras
3. Deep Learning Architectures: LSTM, CNN, Self-Attention and Transformers
4. Experience with image processing and computer vision is a must.
5. Designing data science applications: Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, and prompt engineering.
6. Programming experience in Python
7. Strong written and verbal communication skills.
8. Excellent interpersonal and collaboration skills.
Good-to-Have 1. Experience working with vector databases and graph representations of documents.
2. Experience building or maintaining MLOps pipelines.
3. Experience with cloud computing infrastructure such as AWS SageMaker or Azure ML for implementing ML solutions is preferred.
4. Exposure to Docker, Kubernetes
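As a rough illustration of the Must-Have stack (TensorFlow/Keras plus NumPy, and a CNN for image work), here is a minimal sketch; the input shape, layer sizes, and random placeholder data are illustrative assumptions, not part of the role description.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A small CNN classifier; 28x28 grayscale input is an assumed example shape.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),   # convolutional feature extractor
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),    # 10 output classes, illustrative
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random placeholder data, just to show the fit API.
x = np.random.rand(64, 28, 28, 1).astype("float32")
y = np.random.randint(0, 10, size=(64,))
model.fit(x, y, epochs=1, batch_size=16)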
Mainframe Developer Job Description
A Mainframe Developer is responsible for designing, developing, and maintaining software applications on mainframe systems. Here's a brief overview:
Key Responsibilities
- Application Development: Develop and maintain mainframe applications using languages like COBOL, PL/I, or Assembler.
- Code Maintenance: Perform code reviews, debugging, and optimization to ensure high-quality code.
- System Integration: Integrate mainframe applications with other systems and technologies.
- Testing and Quality Assurance: Test and validate mainframe applications to ensure they meet business requirements.
- Troubleshooting: Troubleshoot mainframe issues and resolve technical problems.
Technical Skills
- Mainframe Systems: Strong understanding of mainframe systems, including z/OS, z/VM, or z/VSE.
- Programming Languages: Proficiency in mainframe programming languages like COBOL, PL/I, or Assembler.
- Database Management: Knowledge of mainframe database management systems like DB2 or IMS.
- Job Scheduling: Familiarity with job scheduling tools like CA-7 or Control-M.
- Security: Understanding of mainframe security concepts and best practices.

Must-Have 1. Hands-on knowledge of machine learning, deep learning, TensorFlow, Python, and NLP.
2. Stay up to date on the latest AI developments relevant to the business domain.
3. Conduct research and development for AI strategies.
4. Experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAEs, and GANs.
5. Experience with transformer models such as BERT, GPT, and RoBERTa, and a solid understanding of their underlying principles, is a plus (a minimal sketch follows the Good-to-Have list below).
Good-to-Have 1. Knowledge of software development methodologies such as Agile or Scrum.
2. Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
3. Experience with natural language processing (NLP) techniques and tools such as spaCy, NLTK, or Hugging Face.
4. Ensure the quality of code and applications through testing, peer review, and code analysis.
5. Root-cause analysis and bug correction.
6. Familiarity with version control systems, preferably Git.
7. Experience building or maintaining cloud-native applications.
8. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
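To make the transformer requirement concrete, here is a minimal sketch of loading a pretrained model with the Hugging Face transformers library; the model name "gpt2" and the prompt are illustrative choices, not mandated by the role.

from transformers import pipeline

# Load a small pretrained text-generation model (illustrative choice).
generator = pipeline("text-generation", model="gpt2")
result = generator("Generative AI models can", max_new_tokens=20)
print(result[0]["generated_text"])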
SN Role descriptions / Expectations from the Role
1 Design, develop, and configure GenAI applications to meet business requirements.
2 Optimize existing generative AI models for improved performance, scalability, and efficiency.
3 Develop and maintain AI pipelines, including data preprocessing, feature extraction, model training, and evaluation (a minimal pipeline sketch follows this list).
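The pipeline expectation in item 3 can be sketched end to end with scikit-learn; the toy texts, labels, and model choice below are placeholder assumptions, not a prescribed stack.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Toy data standing in for a real preprocessed corpus.
texts = ["great product", "terrible service", "loved it", "awful experience"]
labels = [1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, random_state=0, stratify=labels
)

pipe = Pipeline([
    ("features", TfidfVectorizer()),  # feature extraction
    ("clf", LogisticRegression()),    # model training
])
pipe.fit(X_train, y_train)
print("accuracy:", pipe.score(X_test, y_test))  # evaluation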
- Work with the team in capacity of GCP Data Engineer on day-to-day activities.
- Solve problems at hand with utmost clarity and speed.
- Work with Data analysts and architects to help them solve any specific issues with tooling/processes.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Python/Java/React.js, Airflow ETL skills, and GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions, Data Lake). A BigQuery sketch follows this list.
- Design and build production data pipelines from ingestion to consumption within a big data architecture.
- GCP BQ modeling and performance tuning techniques.
- RDBMS and NoSQL database experience.
- Knowledge of orchestrating workloads on the cloud.
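For the BigQuery bullet above, a minimal sketch with the google-cloud-bigquery client; it assumes credentials and a default project are already configured in the environment, and queries a public dataset purely for illustration.

from google.cloud import bigquery

client = bigquery.Client()  # project/credentials come from the environment

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""
for row in client.query(query).result():
    print(row.name, row.total)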
Very good understanding of Azure Monitor, Log Analytics, Alerting, Dashboards & Workbooks.
Good working knowledge of Kusto Query Language (KQL) or Python for data analysis and querying (see the sketch after this list).
Good working knowledge of Azure Functions for automation.
Apply PowerShell programming skills to develop clean code that is stable, consistent, and well-performing.
GitHub/GitLab for DevOps pipelines.
Experience in monitoring infrastructure, managing escalations, and ensuring timely issue resolution to maintain operational stability and stakeholder satisfaction.
Good-to-Have
Knowledge of Terraform, ARM, or Bicep for infrastructure as code.
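As referenced above, a minimal sketch of running a KQL query against a Log Analytics workspace from Python using the azure-monitor-query package; the workspace ID is a placeholder, and DefaultAzureCredential assumes an identity is already configured.

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<workspace-id>",  # placeholder
    query="Heartbeat | summarize beats = count() by Computer | top 5 by beats",
    timespan=timedelta(hours=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)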
Teradata Developer Job Description
A Teradata Developer is responsible for designing, developing, and implementing data warehousing solutions using Teradata. Here's a brief overview:
Key Responsibilities
- Data Warehousing: Design and develop data warehousing solutions using Teradata.
- ETL Development: Develop ETL (Extract, Transform, Load) processes to load data into Teradata.
- SQL Development: Write complex SQL queries to support business intelligence and reporting (see the sketch after this list).
- Performance Optimization: Optimize Teradata database performance for fast query execution.
- Data Modeling: Design and implement data models to support business requirements.
Technical Skills
- Teradata: Strong understanding of Teradata database management system.
- SQL: Proficiency in writing complex SQL queries.
- ETL: Experience with ETL tools like Informatica PowerCenter or Teradata Parallel Transporter (TPT).
- Data Warehousing: Knowledge of data warehousing concepts and best practices.
- Data Modeling: Understanding of data modeling concepts and techniques.
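For the SQL bullet above, a minimal sketch using the teradatasql Python driver; the host, credentials, and the sales table are illustrative placeholders.

import teradatasql

with teradatasql.connect(host="<host>", user="<user>", password="<password>") as con:
    with con.cursor() as cur:
        # 'sales' is an assumed example table.
        cur.execute(
            "SELECT region, SUM(amount) AS total "
            "FROM sales GROUP BY region ORDER BY total DESC"
        )
        for row in cur.fetchall():
            print(row)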
Responsibility of / Expectations from the Role
1 Driving Informatica changes and supporting production issues.
2 Customer interaction as required.
3 Requirement gathering and analysis.
4 Resolution of server-related and GUI-related issues.
5 Taking end-to-end ownership of Informatica changes.
Role descriptions / Expectations from the Role
Develop, deploy, and maintain Power Platform solutions, including canvas apps, model-driven apps, and flows.
Collaborate with cross-functional teams to define, design, and ship new features.
Troubleshoot and resolve issues related to Power Platform applications.
Ensure the security, scalability, and reliability of Power Platform solutions.
You will be joining a fast-growing product team within the End User Technology Services organisation. You will be part of the build-out of the capability in the region and will liaise with our customers across all the LEADING BANK business lines.
DataStage Developer Job Description
A DataStage Developer is responsible for designing, developing, and implementing data integration solutions using IBM InfoSphere DataStage. Here's a brief overview:
Key Responsibilities
- Data Integration: Design and develop data integration jobs using DataStage to extract, transform, and load (ETL) data from various sources.
- Job Development: Develop and test DataStage jobs to meet business requirements.
- Data Transformation: Use DataStage transformations to cleanse, aggregate, and transform data.
- Performance Optimization: Optimize DataStage jobs for performance and scalability.
- Troubleshooting: Troubleshoot DataStage issues and resolve data integration problems.
Technical Skills
- DataStage: Strong understanding of IBM InfoSphere DataStage and its components.
- ETL: Experience with ETL concepts and data integration best practices (the general pattern is sketched after this list).
- Data Transformation: Knowledge of data transformation techniques and DataStage transformations.
- SQL: Familiarity with SQL and database concepts.
- Data Modeling: Understanding of data modeling concepts and data warehousing.
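DataStage itself is a visual tool, but the extract-transform-load pattern it implements can be sketched in a few lines of Python with pandas; the file names, columns, and cleansing rule below are illustrative assumptions only.

import pandas as pd

# Extract: read raw records from a source file (placeholder path).
raw = pd.read_csv("customers_raw.csv")

# Transform: cleanse and aggregate, as a DataStage job's stages would.
raw["name"] = raw["name"].str.strip().str.title()
summary = raw.groupby("country", as_index=False)["order_value"].sum()

# Load: write the result to a target (placeholder path).
summary.to_csv("customers_by_country.csv", index=False)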


Python Developer Job Description
A Python Developer is responsible for designing, developing, and deploying software applications using the Python programming language. Here's a brief overview:
Key Responsibilities
- Software Development: Develop high-quality software applications using Python.
- Problem-Solving: Solve complex problems using Python programming language.
- Code Maintenance: Maintain and update existing codebases to ensure they remain efficient and scalable.
- Collaboration: Collaborate with cross-functional teams to identify and prioritize project requirements.
- Testing and Debugging: Write unit tests and debug applications to ensure high-quality code.
Technical Skills
- Python: Strong understanding of Python programming language and its ecosystem.
- Programming Fundamentals: Knowledge of programming fundamentals, including data structures, algorithms, and object-oriented programming.
- Frameworks and Libraries: Familiarity with popular Python frameworks and libraries, such as Django, Flask, or pandas (see the Flask sketch after this list).
- Database Management: Understanding of database management systems, including relational databases and NoSQL databases.
- Version Control: Knowledge of version control systems, including Git.
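For the frameworks bullet, a minimal Flask sketch; the route name and port are illustrative choices.

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Simple JSON endpoint, e.g. for liveness checks.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=5000, debug=True)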
