11+ IBM Cognos Framework Manager Jobs in Hyderabad | IBM Cognos Framework Manager Job openings in Hyderabad
Apply to 11+ IBM Cognos Framework Manager Jobs in Hyderabad on CutShort.io.

Job Title : Cognos BI Developer
Experience : 6+ Years
Location : Bangalore / Hyderabad (Hybrid)
Notice Period : Immediate Joiners Preferred (Candidates serving notice with 10–15 days left can be considered)
Interview Mode : Virtual
Job Description :
We are seeking an experienced Cognos BI Developer with strong data modeling, dashboarding, and reporting expertise to join our growing team. The ideal candidate should have a solid background in business intelligence, data visualization, and performance analysis, and be comfortable working in a hybrid setup from Bangalore or Hyderabad.
Mandatory Skills :
Cognos BI, Framework Manager, Cognos Dashboarding, SQL, Data Modeling, Report Development (charts, lists, cross tabs, maps), ETL Concepts, KPIs, Drill-through, Macros, Prompts, Filters, Calculations.
Key Responsibilities :
- Understand business requirements in the BI context and design data models using Framework Manager to transform raw data into meaningful insights.
- Develop interactive dashboards and reports using Cognos Dashboard.
- Identify and define KPIs and create reports to monitor them effectively.
- Analyze data and present actionable insights to support business decision-making.
- Translate business requirements into technical specifications and determine timelines for execution.
- Design and develop models in Framework Manager, publish packages, manage security, and create reports based on these packages.
- Develop various types of reports, including charts, lists, cross tabs, and maps, and design dashboards combining multiple reports.
- Implement reports using macros, prompts, filters, and calculations.
- Perform data warehouse development activities and ensure seamless data flow.
- Write and optimize SQL queries to investigate data and resolve performance issues.
- Utilize Cognos features such as master-detail reports, drill-throughs, bookmarks, and page sets.
- Analyze and improve ETL processes to enhance data integration.
- Apply technical enhancements to existing BI systems to improve their performance and usability.
- Possess solid understanding of database fundamentals, including relational and multidimensional database design.
- Hands-on experience with Cognos Data Modules (data modeling) and dashboarding.
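The SQL tuning responsibility above can be illustrated with a small, self-contained sketch; SQLite stands in for the warehouse database, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for a reporting warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 120.0), ("EMEA", 75.5), ("APAC", 40.0)])

# EXPLAIN QUERY PLAN shows how SQLite will execute the filter; without an
# index on `region`, it reports a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchall()

# Adding an index on the filtered column lets the optimizer avoid the scan
# on larger tables, which is a common first step in resolving slow reports.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchone()[0]
print(total)  # 160.0
```

The same investigate-then-index workflow applies to the warehouse databases behind Cognos packages, though the specific tooling (execution plans, index types) varies by vendor.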
Backend Architect:
Technology: Node.js, DynamoDB / MongoDB
Roles:
- Design and implement backend services.
- Ability to redesign the existing architecture.
- Design and implement applications using MVC and microservice patterns.
- 9+ years of experience developing service-based applications using Node.js.
- Expert-level skills in developing web applications using JavaScript, CSS and HTML5.
- Experience working on teams that practice BDD (Behavior-Driven Development).
- Understanding of micro-service architecture and RESTful API integration patterns.
- Experience using Node.js for automation and leveraging npm for package management
- Solid object-oriented design experience, including creating and leveraging design patterns.
- Experience working in a DevOps/Continuous Delivery environment and associated toolsets (e.g., Jenkins, Puppet)
Desired/Preferred Qualifications :
- Bachelor's degree or equivalent experience
- Strong problem-solving and conceptual thinking abilities
- Desire to work in a collaborative, fast-paced, start-up-like environment
- Experience leveraging Node.js frameworks such as Express.
- Experience with distributed source control management, e.g., Git
- Experience in Spring Boot, Jenkins, Git, Hibernate, Kubernetes, and Docker
- Experience in the development of scalable and extensible systems using Java
- Proficiency in database technologies such as MySQL, Oracle, and MongoDB
- Synchronous and asynchronous routines
- Solid and fluent understanding of algorithms and data structures
- Excellent software design, problem-solving, and debugging skills
- Demonstrated high ownership in previous projects
- Excellent communication skills
- Good understanding of Elasticsearch and Redis
- Experience working in a cloud environment, preferably AWS.
- Write unit tests and run automated tests through CI/CD
- Ability to learn new and existing technologies
- Experience in building cloud SaaS or PaaS solutions/products
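One item in the list above asks for sync and async routines. A minimal illustration of the difference, sketched in Python for brevity (the stack in this role is Java/Node.js, but the concept is the same):

```python
import asyncio

def fetch_sync(name: str) -> str:
    # A blocking routine: the caller waits until it returns.
    return f"result:{name}"

async def fetch_async(name: str) -> str:
    # A coroutine: control is yielded at each await, so other tasks can run.
    await asyncio.sleep(0)  # stand-in for non-blocking I/O
    return f"result:{name}"

async def main():
    # Run two async routines concurrently and gather their results in order.
    return await asyncio.gather(fetch_async("a"), fetch_async("b"))

sync_result = fetch_sync("a")
async_results = asyncio.run(main())
print(sync_result, async_results)
```

In Node.js the equivalent shape is `async`/`await` over Promises; in Java, `CompletableFuture` or reactive streams.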

Required Skills:
1. Strong hands-on experience with Boomi process building and deployment, Boomi EDI integration, Boomi API management, alert framework/exception handling, connectors/listeners, and integration pack usage, covering all aspects of design, development, performance tuning, and operational support of the software suite.
2. Define clients' integration requirements through analysis and design
3. Provide technical direction and assistance to clients regarding their integration needs
4. Plan, design, implement, and document integration processes varying in levels of complexity
5. Develop test plan specifications, then test and debug processes according to plan. Work with customers on user acceptance testing.
6. Provide ongoing education and technical assistance to current and prospective customers
7. Display initiative and self-motivation, and deliver high-quality work while meeting all deadlines for both internal and external customers
8. Experience with multiple integration frameworks (e.g., TIBCO, MuleSoft, Oracle SOA) is highly preferred.
9. Experience implementing at least one to two full project cycles involving integration of trading partners, ERP and/or non-ERP systems, and on-premises and cloud/hybrid integrations.
10. Proven ability to design and optimize business processes and to integrate business processes across disparate systems.
11. Excellent analysis skills and the ability to develop processes and methodologies.
12. Experience in scripting (JavaScript/Groovy) is an added advantage
13. Understanding of JSON, XML, flat files (CSV, fixed-width), and EDI, plus knowledge of enterprise systems (CRM, ERP [NetSuite]) and MDH
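Boomi itself is low-code, but the JSON and flat-file handling in the last point can be sketched in plain Python. The payload shape and field names below are hypothetical, standing in for records arriving from an ERP connector:

```python
import csv
import io
import json

# Hypothetical order payload as it might arrive from an ERP via an API connector.
payload = json.loads(
    '[{"order_id": "1001", "sku": "A-17", "qty": 2},'
    ' {"order_id": "1002", "sku": "B-03", "qty": 5}]'
)

# Map the JSON documents onto a fixed set of flat-file columns,
# analogous to defining a flat-file profile in an integration tool.
columns = ["order_id", "sku", "qty"]
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=columns)
writer.writeheader()
for record in payload:
    writer.writerow({col: record[col] for col in columns})

flat_file = buffer.getvalue()
print(flat_file)
```

Fixed-width and EDI formats follow the same map-then-serialize pattern, with positional field widths or segment/element delimiters replacing the CSV commas.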
We are not looking for someone who:
1. Has never worked in a customer-facing role with international customers.
2. Has never led a team of at least three members.
3. Has never led high-level design and technical design discussions, or cannot define a work breakdown structure.
4. Is not flexible for 24x7 rotational shifts or not ready to work from the office.


About Monarch:
At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies.
With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability and cost issues, and provide an avenue for competitive organic and beyond farming by replacing harmful chemical solutions with mechanical ones. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build, and scale the Monarch Tractor and digitally transform farming around the world.
Job Description:
This role requires you to be a tech-savvy contributor in translating customer needs and user requirements into an interactive web-based application. You must be a good problem solver, have keen attention to detail and take responsible actions in ensuring the application is optimized both in technology and in delivering the best user experience.
Work closely with product and engineering teams to create elegant, responsive and interactive interfaces.
Handcraft UI/UX designs into prototypes
Write reusable components and address performance issues of the product.
Requirements:
Bachelor's/Master's degree in Engineering (ECE or CSE preferred)
A minimum of 3 years of relevant work experience as a Web, UI, or Frontend Engineer
Work experience in React (JS/TS) and data structures is a must.
Sound knowledge of JavaScript, HTML, and CSS.
Working knowledge of Git: creating and merging branches, cherry-picking commits, examining the diff between two hashes.
Ability to write clean and maintainable code with attention to performance
Keen eye for design, and experience debugging using the browser console.
Ability to perform in a fast-paced environment and adapt to rapidly changing designs.
AWS experience is a plus, although not mandatory.
What you will get:
At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary, excellent health, dental and vision benefits, and company equity commensurate with the role you’ll play in our success.
Job description
Job Title: Business Development Executive (Antivirus/Cyber Security/ End Point Security/ MDM)
Location: Ahmedabad, Baroda, Rajkot.
Responsibilities:
Distribute IT network security products, endpoint security, DLP, software, desktops/laptops, and networking products through channel partners.
Identify and pursue new business opportunities in the assigned region.
Conduct market research to understand industry trends and competitor activities.
Achieve sales targets and contribute to the overall growth of the business.
Collaborate with internal teams to develop effective sales strategies.
Provide excellent customer service and address client needs.
Prepare and deliver presentations to prospective clients.
Requirements:
Proven experience in business development or sales.
Strong communication and negotiation skills.
Knowledge of the local market in Ahmedabad, Baroda, and Rajkot.
Ability to work independently and as part of a team.
Results-oriented with a focus on achieving sales targets.
Qualifications :
- Graduate/BE/BSc.IT/BCA.
- Minimum 4 years of experience distributing IT products and solutions through channel networks.
Salary Range : Best in the Industry
Job Types: Full-time, Permanent
- Provision Dev/Test/Prod infrastructure as code (IaC)
- Good knowledge of Terraform
- In-depth knowledge of security and IAM / role-based access controls in Azure, and management of Azure Application/Network Security Groups, Azure Policy, and Azure Management Groups and Subscriptions.
- Experience with Azure and GCP compute, storage, and networking (GCP-only candidates can also be considered)
- Experience working with ADLS Gen2, Databricks, and Synapse Workspace
- Experience supporting cloud development pipelines using Git, CI/CD tooling, Terraform, and other Infrastructure as Code tooling as appropriate
- Configuration management (e.g., Jenkins, Ansible, Git)
- General automation, including the Azure CLI and Python, PowerShell, and Bash scripting
- Experience with Continuous Integration/Continuous Delivery models
- Knowledge of and experience in resolving configuration issues
- Understanding of software and infrastructure architecture
- Experience in PaaS, Terraform, and AKS
- Monitoring, alerting, and logging tools, and build/release processes
- Understanding of computing technologies across Windows and Linux


Interfaces with other processes and/or business functions to ensure they can leverage the benefits provided by the AWS Platform process
Responsible for managing the configuration of all IaaS assets across the platforms
Hands-on Python experience
Manages the entire AWS platform (Python, Flask, REST API, serverless framework) and recommends the options that best meet the organization's requirements
Has a good understanding of the various AWS services, particularly S3, Athena, Glue, Lambda, CloudFormation, and other AWS serverless resources, as well as the Python code that ties them together
AWS certification is a plus
Knowledge of best practices for IT operations in an always-on, always-available service model
Responsible for the execution of process controls, ensuring that staff comply with process and data standards
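The Python/Flask/REST stack named above ultimately rests on the WSGI interface, which can be sketched without any dependencies. The route and payload below are hypothetical; Flask adds routing, request parsing, and much more on top of this same shape:

```python
import json

def app(environ, start_response):
    # Minimal WSGI application serving one REST-style endpoint,
    # roughly what Flask constructs under the hood for a @app.route.
    if environ["PATH_INFO"] == "/health" and environ["REQUEST_METHOD"] == "GET":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Exercise the app directly, the way a test client would.
captured = {}
def start_response(status, headers):
    captured["status"] = status

environ = {"PATH_INFO": "/health", "REQUEST_METHOD": "GET"}
response = b"".join(app(environ, start_response))
print(captured["status"], response)
```

In production this application would sit behind a WSGI server (or, in the serverless framework mentioned above, behind an adapter that translates Lambda events into WSGI calls).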
Qualifications
Bachelor’s degree in Computer Science, Business Information Systems, or relevant experience and accomplishments
3 to 6 years of experience in the IT field
AWS Python Developer
AWS, Serverless/Lambda, Middleware.
Strong AWS skills, including Data Pipeline, S3, RDS, and Redshift, with familiarity with other components like Lambda, Glue, Step Functions, and CloudWatch
Must have created REST APIs with AWS Lambda.
3 years of relevant Python experience
Good to have: experience working on projects and problem-solving with large-scale multivendor teams.
Good to have: knowledge of Agile development
Good knowledge of the SDLC.
Hands-on with AWS databases (RDS, etc.)
Good to have: unit testing experience.
Good to have: CI/CD working knowledge.
Good communication skills, as there will be client interaction and documentation.
Education (degree): Bachelor’s degree in Computer Science, Business Information Systems, or relevant experience and accomplishments
Years of Experience: 3-6 years
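The "REST API with AWS Lambda" requirement above usually means a handler behind API Gateway. A minimal sketch, assuming the API Gateway proxy-integration event format (the `/items` route and its payload are hypothetical):

```python
import json

def lambda_handler(event, context):
    # API Gateway (proxy integration) passes the HTTP method and path
    # in the event; the handler returns a statusCode/headers/body dict.
    if event.get("httpMethod") == "GET" and event.get("path") == "/items":
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"items": ["a", "b"]}),
        }
    return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

# Invoke locally the way Lambda would (the context object is unused here).
response = lambda_handler({"httpMethod": "GET", "path": "/items"}, None)
print(response["statusCode"])  # 200
```

Deployment (function creation, API Gateway routing, IAM roles) would be handled separately, e.g., via the serverless framework or CloudFormation mentioned elsewhere in this listing.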
Technical Skills
Linux/Unix system administration
Continuous Integration/Continuous Delivery tools like Jenkins
Cloud provisioning and management – Azure, AWS, GCP
Ansible, Chef, or Puppet
Python, PowerShell & BASH
Job Details
JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
RC: TBD
PREFERRED LOCATION: HYDERABAD, IND
POSITION REPORTS TO: Manager USI T&I Cloud Managed Platform
CAREER LEVEL: 3
Work Location:
Hyderabad


SpringML is looking to hire a top-notch Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets. As a Data Engineer, your primary role will be to design and build data pipelines. You will be focused on helping client projects with data integration, data prep, and implementing machine learning on datasets. In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Title: Data Engineers
Location: Hyderabad
Work Timings: 10:00 AM - 06:00 PM
RESPONSIBILITIES:
Ability to work as a member of a team assigned to design and implement data integration solutions.
Build data pipelines using standard frameworks in Hadoop, Apache Beam, and other open-source solutions.
Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
Propose design solutions and recommend best practices for large-scale data analysis
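The pipeline work described above follows a read → transform → filter → aggregate shape. Since Apache Beam's own API is heavier, here is a dependency-free Python sketch of that shape (the event data is hypothetical; Beam expresses the same steps as PTransforms over PCollections):

```python
from collections import Counter

# Hypothetical raw events, standing in for lines read from a source
# such as GCS, HDFS, or a streaming topic.
raw_events = ["click,home", "click,search", "view,home", "click,home"]

def parse(line):
    # Transform step: split each raw line into an (action, page) pair.
    action, page = line.split(",")
    return action, page

def keep_clicks(event):
    # Filter step: retain only click events.
    return event[0] == "click"

# Compose the pipeline lazily, then aggregate clicks per page.
parsed = map(parse, raw_events)
clicks = filter(keep_clicks, parsed)
clicks_per_page = Counter(page for _, page in clicks)
print(dict(clicks_per_page))  # {'home': 2, 'search': 1}
```

In Beam, `parse` and `keep_clicks` would become `beam.Map` and `beam.Filter` steps and the final count a `CombinePerKey`, letting the same logic scale out across a distributed runner.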
SKILLS:
B.Tech degree in computer science, mathematics, or other relevant fields.
Good programming skills – experience and expertise in one of the following: Java, Python, Scala, C.

MOURI Tech, a global enterprise solutions provider is commit
