11+ Oracle BRM Jobs in Pune | Oracle BRM Job openings in Pune
Apply to 11+ Oracle BRM Jobs in Pune on CutShort.io.
Domain: Telecom
Location: All Metros
Skills: BRM or Amdocs Portal Infranet
Must have: 3-4 years of experience with the BRM application
Technical Experience:
-Translate designs and wireframes into high-quality code
-Design, build and maintain high performance, reusable, and reliable Java code
-Ensure the best possible performance, quality, and responsiveness of the application
-Identify and correct bottlenecks and fix bugs
-Help maintain code quality, organization, and automatization
-Strong knowledge of Android SDK, different versions of Android, and how to deal with different screen sizes
-Familiarity with RESTful APIs to connect Android applications to back-end services
-Strong knowledge of Android UI design principles, patterns, and best practices
-Experience with offline storage, threading, and performance tuning
-Ability to design applications around natural user interfaces, such as touch
-Familiarity with the use of additional sensors, such as gyroscopes and accelerometers
-Knowledge of the open-source Android ecosystem and the libraries available for common tasks
-Ability to understand business requirements and translate them into technical requirements
-Familiarity with cloud message APIs and push notifications
-A knack for benchmarking and optimization
-Understanding of Google's Android design principles and interface guidelines
-Proficient understanding of code versioning tools, such as Git
Position: Sr SDET
Experience: 5 years
Location: Pune (Amar Tech Park)
Mode: 5 days a week from office
What’s the role?
We are looking for a Senior SDET to contribute to the design and building of our software offerings. Our engineering team works with .NET in an Agile environment. We use Azure DevOps Pipelines and Releases. Our definition of 'DONE' includes writing automated tests so that full regression on releases is effortless. We strive to do things right rather than band-aid problems. Our management is engaged and looking for feedback on how we can become better, iteratively.
You will have the opportunity to…
- Participate in story refinement sessions to ensure the details and dependencies are well-defined and understood, with considerations for testing
- Collaborate with Product Owner and Developers as a team to deliver quality
- Write and maintain test cases, execute them, and perform ad-hoc testing with the end-user experience in mind
- Automate test cases based on priority before the close of the sprint
- Participate in code review to ensure commits are up to standards
- Monitor Azure Releases for regression bugs and/or environment issues
- Work with geo-distributed teams to coordinate testing of features
- Be vocal during Retrospective meetings and follow up on process improvements
- Manage quality and bug reports in all stages of releases
Our ideal candidate will have…
- 5+ years of experience as an SDET
- 3+ years of experience with Selenium WebDriver and Grid (see the sketch after this list)
- 2+ years of experience testing web APIs through code
- Strong experience with OOP design and C# programming skills
- Ability to write complex SQL queries for data validation
- Knowledge of test methodologies and their corresponding tools
- Ability to recognize errors and assess risks within applications and/or processes
- Working knowledge of Visual Studio 2016+ and Git
- 1+ year of experience with CI/CD pipelines
- An understanding of the ROI and risk of ad-hoc testing, test automation, code coverage, and feature coverage
- A passion for design, development, and quality.
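As an illustrative sketch only of the Selenium WebDriver and Grid experience named above: the role itself works in C#/.NET, but Python is used here for brevity, and the Grid URL, page, and element locators are hypothetical.

```python
# Minimal sketch: run a browser session on a Selenium Grid hub (all URLs/locators are placeholders)
from selenium import webdriver
from selenium.webdriver.common.by import By

GRID_URL = "http://selenium-grid.example.local:4444/wd/hub"  # hypothetical Grid hub

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")

# Remote() sends the session to a Grid node instead of a local browser
driver = webdriver.Remote(command_executor=GRID_URL, options=options)
try:
    driver.get("https://example.com/login")           # placeholder app under test
    driver.find_element(By.ID, "username").send_keys("qa.user")
    driver.find_element(By.ID, "password").send_keys("********")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title                # simple regression check
finally:
    driver.quit()
```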
Job Location: Pune / Bangalore / Hyderabad / Indore
- Very good knowledge of MuleSoft components.
- Prior work experience in setting up a Center of Excellence (CoE) using MuleSoft integration software.
- Good understanding of various integration patterns.
- Ability to deliver projects independently with little or no supervision.
- Previous experience working in a multi-geographic team.
- Previous experience with best programming practices.
- Good written and oral communication skills – English.
Responsibilities:
- Contribute to managing end-to-end implementation project life cycle, driving optimization of operating models
- Determine Project Scope, Objectives and Dependencies
- Develop and manage a detailed project schedule, work plan and key milestones
- Determine and Define Project Resource Allocations
- Manage Risks, Issues, Assumptions and Dependencies on Project by assigning appropriate ownership and escalations
- Capture project health metrics, maintaining a central repository for all projects and project information that provide insights into processes and frameworks that work
- Establish and execute project governance, viz. SteerCo meetings and weekly cadences, and track action items to closure
- Conduct governance calls for ongoing projects and ensure project commitments are being met by Darwinbox's implementation team as well as the Client's implementation team
- Collaborate with Client on requirement changes and initiate change management process
- Identify, document and update the project scope for changes
- Initiate appropriate review and approvals on the changes
- Evaluate and analyze the impact of changes and communicate with stakeholders
- Seek approvals and commitments on the scope, cost and schedule
- Ensure SOP adherence and time-logging, and drive effective use and adoption of project management tools and methodologies
- Review/prepare Project Plans for implementation projects and monitor resource allocation
- Identify Stakeholder Communication Requirements on project performance metrics
- Collaborate with Product Teams on Product enhancements
- Document impact and prioritize changes to Product Backlog and timelines
- Stakeholder Reporting - Preparing status reports, highlighting risks to the project timelines, proactively escalating potential red flags, and recommending mitigation solutions
- Track Project Closure / Handover activities to downstream teams.
General Requirements:
- B. Tech or an equivalent degree
- MBA or an equivalent degree is preferred but not mandatory
- Excellent written and verbal communication skills
- Ability to work under pressure and adaptable to change
- Excellent time management and problem-solving skills
- Proficiency in MS Office tools
- Any experience in MS SharePoint, MS Power Automate, Excel Macros, and Project Management Tools like JIRA, and Zoho Projects is an added advantage
- Any Project Management (PMP / PRINCE2 / Agile CSM) or Agile certification is mandatory
- 8-10 years of prior experience in a project management role is preferred with strong experience in HRMS implementation.
MNC IT company in Pune
Job Description:
- Creation and delivery of end-to-end solutions for customers, addressing requirements through technical architecture in consideration of process design and ServiceNow technical best practices and standards
- Define, at an architectural and design level of detail, technical solutions aligned with the client's business problems, and work on scoping complex service engagements, typically involving multiple ServiceNow products and complex integrations with client applications/systems (see the sketch below)
- Hands-on development, design details, integrations (REST/SOAP), in-depth ServiceNow platform knowledge, scoped application creation/management, and modules such as ITSM, ITOM, ITBM, SecOps and HR
- Experience working with Glide, AJAX, client scripting, business rules, UI Policies, etc.
- Web technologies (XML, HTML, Angular, Bootstrap, JavaScript, Web Services, etc.) and experience working in a SaaS environment
- Knowledge of technical components such as LDAP, VPN, SSL, SAML/SSO and other widespread enterprise technologies
- Knowledge and experience in the following ServiceNow product areas is mandatory: Incident, Change, Config/CMDB, Service Catalog/Request FF/Workflow, Service Portal, Domain Separation and multi-language
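As a hedged illustration of the REST integrations mentioned above, the sketch below queries the ServiceNow Table API from Python. The instance URL, credentials, and query are placeholders; real engagements would typically use OAuth and follow the client's integration standards.

```python
# Minimal sketch: read open incidents via the ServiceNow Table API (instance/credentials are placeholders)
import requests

INSTANCE = "https://your-instance.service-now.com"  # hypothetical instance
AUTH = ("integration.user", "********")             # basic auth shown for brevity

def get_open_incidents(limit=5):
    """Return a list of active incidents from the incident table."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/incident",
        params={"sysparm_query": "active=true", "sysparm_limit": limit},
        auth=AUTH,
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    for inc in get_open_incidents():
        print(inc["number"], inc["short_description"])
```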
Certifications:
- ServiceNow System Administrator Certification (CSA) – Required: System Administration Advanced, ServiceNow Fundamentals
- ServiceNow Certified Application Developer (CAD) – Preferred: Application Development Fundamentals, Domain Separation Implementation, Platform Implementation, Scripting in ServiceNow Fundamentals, Service Portal Advanced, Service Portal Fundamentals
- ITSM Certification (CIS-ITSM) – Required: ITSM Fundamentals, ITSM Implementation
Job to be performed (expectation setting):
1. Customer-facing role: from requirement gathering, high-level solution creation and effort estimation, through implementation, integration and unit testing, to supporting and driving UAT, deployment and go-live support.
2. Customer's point of contact for all Automation or ServiceNow requirements.
3. Ensure technical deliverables to our customers are complete, consistent, high quality and on time, and deliver valued outcomes.
4. Demonstrate interpersonal skills, a customer-centric attitude and the ability to deal with cultural diversity; be a proven team player and team builder, committed to driving customer value realization while ensuring all actions contribute towards a positive experience for the customer.
Role and Responsibilities
- Execute data mining projects, training and deploying models over a typical duration of 2-12 months.
- The ideal candidate should be able to innovate, analyze the customer requirement, develop a solution within the time box of the project plan, and execute and deploy the solution.
- Integrate the data mining projects as embedded data mining applications on the FogHorn platform (on Docker or Android).
Core Qualifications
Candidates must meet ALL of the following qualifications:
- Have analyzed, trained and deployed at least three data mining models in the past. If the candidate did not directly deploy their own models, they will have worked with others who have put their models into production. The models should have been validated as robust over at least an initial time period.
- Three years of industry work experience, developing data mining models which were deployed and used.
- Core programming experience in Python, using data-mining libraries like Scikit-Learn; other relevant Python libraries include NumPy, SciPy and Pandas (see the sketch after this list)
- Data mining algorithm experience in at least three algorithms across prediction (statistical regression, neural nets, deep learning, decision trees, SVM, ensembles), clustering (k-means, DBSCAN or other), or Bayesian networks
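As a minimal, hedged illustration of the train-and-validate workflow described above, the sketch below uses Scikit-Learn with a bundled dataset as a stand-in for real project data; it is not tied to any specific FogHorn project.

```python
# Minimal sketch: train a classifier and run a hold-out validation check before deployment
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)          # stand-in for real project data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Hold-out validation as a first robustness check
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```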
Bonus Qualifications
Any of the following extra qualifications will make a candidate more competitive:
- Soft Skills
- Sets expectations, develops project plans, and meets them.
- Experience adapting technical dialogue to the right level for the audience (i.e. executives) or specific jargon for a given vertical market and job function.
- Technical skills
- Commonly, candidates have a MS or Ph.D. in Computer Science, Math, Statistics or an engineering technical discipline. BS candidates with experience are considered.
- Have managed past models in production over their full life cycle until model replacement is needed. Have developed automated model refreshing on newer data. Have developed frameworks for model automation as a prototype for product.
- Training or experience in Deep Learning, such as TensorFlow, Keras, convolutional neural networks (CNN) or Long Short Term Memory (LSTM) neural network architectures. If you don’t have deep learning experience, we will train you on the job.
- Shrinking deep learning models, optimizing to speed up execution time of scoring or inference.
- OpenCV or other image processing tools or libraries
- Cloud computing: Google Cloud, Amazon AWS or Microsoft Azure. We have integration with Google Cloud and are working on other integrations.
- Experience with tree-based ensembles such as XGBoost or Random Forests is helpful.
- Complex Event Processing (CEP) or other streaming data as a data source for data mining analysis
- Time series algorithms, from ARIMA to LSTM to Digital Signal Processing (DSP); a minimal ARIMA sketch follows this list.
- Bayesian Networks (BN), a.k.a. Bayesian Belief Networks (BBN) or Graphical Belief Networks (GBN)
- Experience with PMML is of interest (see www.DMG.org).
- Vertical experience in Industrial Internet of Things (IoT) applications:
- Energy: Oil and Gas, Wind Turbines
- Manufacturing: Motors, chemical processes, tools, automotive
- Smart Cities: Elevators, cameras on population or cars, power grid
- Transportation: Cars, truck fleets, trains
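For the time series item above, a minimal ARIMA sketch using statsmodels on a synthetic series standing in for real sensor or demand data; the order (1, 1, 1) is chosen arbitrarily for illustration.

```python
# Minimal sketch: fit an ARIMA model and produce a short forecast (series and order are illustrative)
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series standing in for real data
rng = pd.date_range("2020-01-01", periods=48, freq="MS")
series = pd.Series(np.cumsum(np.random.randn(48)) + 50, index=rng)

model = ARIMA(series, order=(1, 1, 1))   # (p, d, q) chosen for illustration only
fitted = model.fit()
print(fitted.forecast(steps=6))          # six-step-ahead forecast
```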
About FogHorn Systems
FogHorn is a leading developer of “edge intelligence” software for industrial and commercial IoT application solutions. FogHorn’s Lightning software platform brings the power of advanced analytics and machine learning to the on-premise edge environment enabling a new class of applications for advanced monitoring and diagnostics, machine performance optimization, proactive maintenance and operational intelligence use cases. FogHorn’s technology is ideally suited for OEMs, systems integrators and end customers in manufacturing, power and water, oil and gas, renewable energy, mining, transportation, healthcare, retail, as well as Smart Grid, Smart City, Smart Building and connected vehicle applications.
Press: https://www.foghorn.io/press-room/
Awards: https://www.foghorn.io/awards-and-recognition/
- 2019 Edge Computing Company of the Year – Compass Intelligence
- 2019 Internet of Things 50: 10 Coolest Industrial IoT Companies – CRN
- 2018 IoT Platforms Leadership Award & Edge Computing Excellence – IoT Evolution World Magazine
- 2018 10 Hot IoT Startups to Watch – Network World. (Gartner estimated 20 billion connected things in use worldwide by 2020)
- 2018 Winner in Artificial Intelligence and Machine Learning – Globe Awards
- 2018 Ten Edge Computing Vendors to Watch – ZDNet & 451 Research
- 2018 The 10 Most Innovative AI Solution Providers – Insights Success
- 2018 The AI 100 – CB Insights
- 2017 Cool Vendor in IoT Edge Computing – Gartner
- 2017 20 Most Promising AI Service Providers – CIO Review
Our Series A round was for $15 million. Our Series B round was for $30 million in October 2017. Investors include Saudi Aramco Energy Ventures, Intel Capital, GE, Dell, Bosch, Honeywell and The Hive.
About the Data Science Solutions team
In 2018, our Data Science Solutions team grew from 4 to 9, and we are now growing again from 11. We work on revenue-generating projects for clients, such as predictive maintenance, time-to-failure prediction and manufacturing defect detection. About half of our projects have been related to vision recognition or deep learning. We are not only working on consulting projects but also developing vertical solution applications that run on our Lightning platform, with embedded data mining.
Our data scientists like our team because:
- We care about "best practices"
- We have a direct impact on the company's revenue
- We give and receive mentoring as part of the collaborative process
- Questioning and challenging the status quo with data is safe
- Intellectual curiosity is balanced with humility
- We present papers or projects in our "Thought Leadership" meeting series to support continuous learning
at WNS Global Services
We have an opening at WNS for the Pune location for a Chartered Accountant (fresher). The candidate should have cleared the CA exams and be comfortable working in US shifts.