11+ IBM AIX Jobs in Bangalore (Bengaluru) | IBM AIX Job openings in Bangalore (Bengaluru)
Apply to 11+ IBM AIX Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest IBM AIX Job opportunities across top companies like Google, Amazon & Adobe.
Overall 5+ years of experience required in Finacle Development/Support
Data Engineer (MS Data Engineer + Snowflake/Databricks)
Required Skills:
- 6 to 8 years as a practitioner in data engineering or a related field.
- Experience in Snowflake or Databricks.
- Experience with data processing frameworks like Apache Spark or Hadoop.
- Familiarity with cloud platforms (AWS, Azure) and their data services.
- Experience with data warehousing concepts and technologies.
- Experience with message queues and streaming platforms (e.g., Kafka).
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a geographically distributed team.
Basic Qualifications
- Need to have a working knowledge of AWS Redshift.
- Minimum 1 year of designing and implementing a fully operational production-grade large-scale data solution on Snowflake Data Warehouse.
- 3 years of hands-on experience with building productized data ingestion and processing pipelines using Spark, Scala, Python
- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions
- Expertise and excellent understanding of Snowflake Internals and integration of Snowflake with other data processing and reporting technologies
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
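The pipeline-related qualifications above can be illustrated with a toy example. The sketch below is not Snowflake- or Spark-specific (sqlite3 stands in for the warehouse, and the CSV sample, table name and schema are invented); it only shows the shape of a small batch ingestion step: parse, validate, dedupe, load.

```python
import csv
import io
import sqlite3

# Invented sample feed: one malformed row and a repeated order_id.
RAW = """order_id,amount
1,10.50
2,not-a-number
2,7.25
3,7.25
"""

def ingest(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Parse raw CSV, skip malformed rows, dedupe on order_id, load."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    loaded = 0
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip records that fail validation
        # INSERT OR IGNORE dedupes on the primary key (order_id)
        cur = conn.execute(
            "INSERT OR IGNORE INTO orders VALUES (?, ?)",
            (int(row["order_id"]), amount),
        )
        loaded += cur.rowcount
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
print(ingest(RAW, conn))  # 3 rows loaded; the malformed row is dropped
```

In a real Spark/Snowflake pipeline the same validate-dedupe-load structure applies, just distributed across partitions.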
Location = Delhi / Bengaluru
Profile = Assistant Manager / Manager - International & Domestic Sourcing (Metal Scraps). This is an urgent requirement.
Below are strict criteria.
1. Should have 4+ years of experience in international & domestic sourcing / purchasing / procurement of Aluminium / Metal / Alloy Scraps of various kinds, i.e., Tense, Troma, TT, Zorba.
2. Should have travelled to international locations for the aforesaid sourcing purpose (not strict).
What does the role look like?
We are looking for a high-energy, detail-oriented, and technically savvy Quality Assurance Lead who is an excellent team manager as well as an individual contributor, capable of understanding and driving test execution with minimal help, and who can work towards understanding application performance, functionality, and features in great depth, enabling them to report issues, take ownership, and drive them to closure.
What will you be doing?
Develop and execute automated tests and test plans
Efficiently execute test cases across all functional areas of the products (API and App)
Review product user interface for conformity to design guidelines
Find, isolate, document, regress, and track bugs through resolution
Interpret and report testing results, and be a vocal proponent for quality in every phase of the development process
Work with Engineering and Product to understand the overall product requirements and technical architecture and how each feature is implemented
Ensure the highest quality product delivery with security
Evaluate and integrate open-source and in-house developed toolsets
What will you need?
Bachelor's/Master's degree in CS or a related field from a reputed college
Experience in manual testing for applications and APIs
Experience in automation testing using available frameworks
Strong knowledge of QA methodologies, testing frameworks and tools
Demonstrated experience in test planning, test design, test execution and reporting.
Excellence in technical communication with peers and non-technical partners.
Understanding & experience with software design patterns, RESTful APIs and microservice architecture
Prior experience in start-ups or health-tech will be a plus
About KAFQA
At Kafqa, we are building an online global performing arts academy. For our students, we offer live classes conducted by experienced & expert artists. For our artists, we offer a bouquet of opportunities to monetize their skills including being an instructor at Kafqa. Our mission is to serve artists in their journey from their first steps in the art to success on the largest stages in the world.
Founder
The founder is Shariq Plasticwala. He is a graduate of IIT Bombay & Stanford GSB. He was part of the founding team of Amazon India where he played a key role for over 8 years. Among his roles at Amazon, he was the CEO of Amazon’s first joint venture in India and a Board Member of Amazon’s payments business.
Role
The role is a unique opportunity to be a part of the core team at KAFQA. The responsibilities of the role are as follows:
- Understand the needs of employees across different segments and across their lifecycle (discovery of Kafqa opportunities, application experience, interview experience, decision experience, onboarding, settling down, scaling with the organization, attrition & separation) & design best-in-class experiences, always thinking technology first
- Understand the needs of instructors separately and design best-in-class experiences for them in partnership with heads of departments and other teams.
- Meet compliance requirements mandated by central & state governments.
The role is based in Bangalore, India and reports directly to the Founder, CEO of KAFQA.
Experience, Qualifications & Person Type
The ideal candidate is someone who –
- Has 5+ years of experience, preferably in an HR role
- Thinks from first principles to re-imagine experiences for employees & challenge the status quo
- Thinks technology first
- Is comfortable prioritizing
- Is creative and can innovate
- Is interested in the performing arts & preferably is a performer themselves
- Is willing to fulfil all responsibilities above & go beyond as needed
- Is a self-starter, operates with autonomy & can deal with ambiguity while being innovative & frugal
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies in its sector. Our
vision to evolve into a strong full-service logistics organization reflects itself in our various lines of business, like B2C
logistics 3PL, B2B Xpress, Hyperlocal and Cross-border Logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve as the most trusted
logistics partner in India. XB has progressively carved its way towards best-in-class technology platforms, an
extensive logistics network reach, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our
big focus areas for the very near future include strengthening our presence as service providers of choice and
leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a high pace. This is a great opportunity to join
the team working on forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning
and Data Engineering, leading projects and teams of AI Engineers in collaboration with Data Scientists. In your role, you
will build high performance AI/ML solutions using groundbreaking AI/ML and BigData technologies. You will need to
understand business requirements and convert them to a solvable data science problem statement. You will be
involved in end to end AI/ML projects, starting from smaller scale POCs all the way to full scale ML pipelines in
production.
Seasoned AI/ML Engineers would own the implementation and productionization of cutting-edge AI-driven algorithmic
components for search, recommendation and insights to improve the efficiencies of the logistics supply chain and
serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact to
the organization while solving challenging problems in the areas of AI, ML, Data Analytics and Computer Science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
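As a rough illustration of the first application area (route optimization), the sketch below implements a greedy nearest-neighbour heuristic in plain Python. The coordinates are invented, and a production system would use proper optimization (LPs/MILPs, as listed further down) rather than this heuristic.

```python
import math

def route(depot, stops):
    """Greedy nearest-neighbour tour: always visit the closest unvisited stop."""
    order, current, remaining = [], depot, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        order.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return order

# Invented depot and delivery coordinates
print(route((0, 0), [(5, 5), (1, 0), (2, 2)]))  # [(1, 0), (2, 2), (5, 5)]
```

The heuristic is fast but can be far from optimal; it only conveys the problem shape.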
An Artificial Intelligence Engineer would apply themselves in the areas of –
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning – Logistic Regression, Decision Trees, Random Forests, XGBoost, etc.
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
The AI Engineering team enables internal teams to add AI capabilities to their apps and workflows easily via APIs
without needing to build AI expertise in each team – Decision Support, NLP, Computer Vision, for Public Clouds and
Enterprise in NLU, Vision and Conversational AI.
The candidate is adept at working with large data sets to find opportunities for product and process optimization, and
at using models to test the effectiveness of different courses of action. They must have experience with a variety of
data mining/data analysis methods and data tools, building and implementing models, using/creating algorithms, and
creating/running simulations. They must be comfortable working with a wide range of stakeholders and functional
teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with
stakeholders to improve business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend, that automates training and
deployment of ML models.
● Building cloud services in Decision Support (Anomaly Detection, Time series forecasting, Fraud detection,
Risk prevention, Predictive analytics), computer vision, natural language processing (NLP) and speech that
work out of the box.
● Brainstorm and Design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists/SW engineers to build out other parts of the infrastructure, effectively
communicating your needs and understanding theirs, and address external and internal stakeholders'
product challenges.
● Build core of Artificial Intelligence and AI Services such as Decision Support, Vision, Speech, Text, NLP, NLU,
and others.
● Leverage cloud technology – AWS, GCP, Azure
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow), Big Data,
Hadoop, HBase, Spark, etc.
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experiences, supply chain metrics and other
business outcomes.
● Develop company A/B testing framework and test model quality.
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects with data science techniques and associated libraries
such as AI/ML or equivalent NLP (Natural Language Processing) packages. Such techniques include a strong
understanding of statistical models, probabilistic algorithms, classification, clustering, deep learning or
related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
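One of the responsibilities above is developing a company A/B testing framework and testing model quality. The statistical core of such a framework can be sketched as a two-proportion z-test; the conversion numbers below are invented, and a real framework would also handle assignment, logging and multiple-testing concerns.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)               # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
    return (p_b - p_a) / se

# Invented experiment: 20% vs 26% conversion over 1000 users each
z = two_proportion_z(200, 1000, 260, 1000)
print(round(z, 2))  # 3.19 – |z| > 1.96, so significant at the 5% level
```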
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science, Computer Engineering
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computer languages (R, Python, SQL, etc.) to manipulate data and draw insights
from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.
● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of using web services: Redshift, S3, Spark, Digital Ocean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression,
simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from 3rd party providers: Google Analytics, Site Catalyst, Coremetrics,
AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, MySQL, Kafka etc.
● Knowledge of visualizing/presenting data for stakeholders using: Quicksight, Periscope, Business Objects,
D3, ggplot, Tableau etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prepare data for machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling / orchestration such as Airflow or Oozie
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine-tuning and optimizing ML (especially Deep Learning) models to bring down
serving latency.
● Exposure to the ML model productionization stack (e.g. MLFlow, Docker)
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.
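The last requirement above (exploratory data analysis with SQL) can be sketched with the standard library, using sqlite3 as a stand-in for Redshift/BigQuery; the delivery table and its numbers are invented.

```python
import sqlite3

# Invented on-time delivery data, sliced by city with a GROUP BY
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deliveries (city TEXT, on_time INTEGER)")
conn.executemany(
    "INSERT INTO deliveries VALUES (?, ?)",
    [("BLR", 1), ("BLR", 0), ("BLR", 1), ("DEL", 1), ("DEL", 1)],
)
rows = conn.execute(
    "SELECT city, AVG(on_time) AS on_time_rate, COUNT(*) AS n "
    "FROM deliveries GROUP BY city ORDER BY city"
).fetchall()
print(rows)  # BLR: ~0.67 on-time rate over 3 orders; DEL: 1.0 over 2
```

The same GROUP BY / aggregate pattern scales directly to warehouse SQL dialects.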
• Develop White-box test cases from API functional specification
• Write maintainable scripts for API Automation testing
• Follow release cycles and commitment to deadlines
• Collaborate with the team and communicate effectively
• Ability to work in a fast-paced start-up
CANDIDATE MUST HAVE
• Modular/reusable test scripts using Java / JavaScript
• Load testing
• Test tools e.g. JMeter, Apache Benchmark, etc
DESIRED SKILLS & EXPERIENCE
• BE/BTech in Computer Science or related technical discipline
• Good knowledge of Java / JavaScript-based test frameworks
• Should have experience in building API automation from scratch
• Experience in writing modular/reusable test scripts using Java / JavaScript
• Experience with performance and load testing
• Experience with test tools e.g. JMeter, Apache Benchmark, etc.
• Knowledge of JSON-based Restful Web Services
• Experience in working with penetration testing tools will be a plus
• Knowledge of GIT, Bitbucket, JIRA, Linux Shell Script, and CI/CD process
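The posting above asks for modular, reusable API test scripts in Java/JavaScript; the snippet below sketches the same idea in Python for brevity. It checks an API response against a functional spec; fetch_user is a stub standing in for a real HTTP call, and the spec fields are invented.

```python
import json
import unittest

def fetch_user(user_id):
    """Stub for an HTTP GET; a real script would call the API and parse JSON."""
    return json.loads('{"id": %d, "name": "Asha", "active": true}' % user_id)

class UserApiTest(unittest.TestCase):
    """White-box checks derived from a (hypothetical) functional spec."""

    def test_response_shape(self):
        body = fetch_user(7)
        self.assertEqual(body["id"], 7)             # id echoes the request
        self.assertIsInstance(body["name"], str)    # required string field
        self.assertIn(body["active"], (True, False))

unittest.main(exit=False, argv=["api_tests"])
```

Keeping the fetch function separate from the assertions is what makes such scripts modular and reusable across endpoints.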
Type: Individual contributor with good hands-on proficiency.
Must have
- Strong proficiency in at least one of Java, Ruby, Python
- Exposure to databases: any of PostgreSQL, MySQL, Apache Cassandra
- Any NoSQL database experience is a plus
- Exposure to AWS cloud infrastructure: EC2 or S3
- Proficiency with Git
- MUST: Experience using REST to make API calls.
Great to have:
- Experience working with one or more middleware, enterprise bus, queueing frameworks
- Any of Memcached/Redis, Apache Kafka / RabbitMQ / PubSub+ / AmazonMQ
Soft skills:
- Appreciation for clean and well documented code
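For the REST must-have above, a minimal sketch of building a GET request with query parameters and decoding a JSON body looks like this; the endpoint URL is invented, and the actual network call is left commented out.

```python
import json
import urllib.parse
import urllib.request

def build_request(base, path, params):
    """Assemble a GET request with an encoded query string and JSON Accept header."""
    url = "%s%s?%s" % (base, path, urllib.parse.urlencode(params))
    return urllib.request.Request(url, headers={"Accept": "application/json"})

def parse_body(raw_bytes):
    """Decode a JSON response body into Python objects."""
    return json.loads(raw_bytes.decode("utf-8"))

req = build_request("https://api.example.com", "/v1/orders", {"status": "open"})
print(req.full_url)  # https://api.example.com/v1/orders?status=open
# with urllib.request.urlopen(req) as resp:  # the real call; needs network
#     data = parse_body(resp.read())
```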
What will you do at Tradyl:
(Examples for illustration only)
- Build a shipping service module that is called by our website to query shipping rates from India to a destination country. Configure this so that an Ops person can update shipping costs as and when they change. Own deployment and monitoring of this service.
- Use Zapier to build a workflow to export a MixPanel report into a Google sheet every day.
- Change our supplier portal (built on bubble.io) to make an API call to our customer-facing site whenever a supplier modifies their profile.
- Write an alert mechanism that identifies catalogues with insufficient information and makes them non-discoverable, which can run every day.
- Work with the Business Team to design a workflow for product inwarding using Airtable. Write a small app within Airtable so that whenever a product is updated as “shipped” in Airtable, it updates the customer-facing website.
- Use an open source dashboarding framework to create a quick dashboard to track important business events.
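The alert-mechanism example above (flagging catalogues with insufficient information) can be sketched in a few lines; the required fields and sample catalogues below are invented for illustration.

```python
# Hypothetical completeness check: a catalogue missing any required
# field would be flagged and could then be made non-discoverable.
REQUIRED = ("title", "price", "images")

def incomplete(catalogues):
    """Return (id, missing_fields) for catalogues failing the check."""
    flagged = []
    for cat in catalogues:
        missing = [f for f in REQUIRED if not cat.get(f)]
        if missing:
            flagged.append((cat["id"], missing))
    return flagged

cats = [
    {"id": 1, "title": "Kurta set", "price": 899, "images": ["a.jpg"]},
    {"id": 2, "title": "", "price": 499, "images": []},
]
print(incomplete(cats))  # [(2, ['title', 'images'])]
```

Run daily (e.g. via cron), the flagged list would drive the non-discoverable toggle.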
We are hiring for an Inside Sales Executive.
Company Name - Ginesys
URL - www.ginesys.in
Role - Inside Sales
About Company –
Ginesys is one of the leading retail software companies in India and is the most preferred solution for fashion and lifestyle brands and supermarkets. We have offices in 5 metros (Gurugram, Kolkata, Mumbai, Bengaluru, and Hyderabad) and a team strength of 150+. Our customers are typically SMB retailers and lifestyle brands and currently, our user base is 20,000+. We recently raised PE funding and are looking to expand our team strength in all the core areas to ensure we are delivering cutting-edge solutions to this market in the best possible way. Ginesys is a great launchpad for anyone looking for a career in SaaS, Enterprise, ERP, and Retail Solutions.
Responsibilities, Duties and Required Skills
- Research new prospects and generate interest through emailers and cold calling
- Follow up with prospects for feedback on the sales process
- Team up with channel partners to build a pipeline of exciting deals
- Update CRM software with existing prospect and customer contacts
- Route qualified opportunities to the appropriate sales executives for further action
- Assist the marketing team in its overall activities through active planning and participation in trade shows and meets.
Required Skills
- Must maintain and exhibit the highest standards of ethical conduct
- Proven ability to generate a pipeline for the sales team in B2B sales
- Have worked in Software / IT companies
- Excellent creative thinking skills with an emphasis on developing innovative solutions to complex problems that may not have one clear answer.
- High attention to detail and excellent organizational skills, adhering to deadlines while ensuring accuracy
If interested, please share your updated CV and your availability for a face-to-face interview. For any query, you can reach me on the below-mentioned number.
Role & Responsibilities
We are a workplace thriving on young energy, mentorship, craziness, drive and commitment to making a difference. This is what we could open up for you!
We are looking for an experienced and accomplished Frontend Developer (ReactJS). The role will suit a natural problem solver. The core role will be in a technical leadership capacity.
Primary Responsibilities
- Meeting with the development team to discuss user interface ideas and applications.
- Reviewing application requirements and interface designs.
- Identifying web-based user interactions.
- Developing and implementing highly-responsive user interface components using React concepts.
- Troubleshooting interface software and debugging application codes.
- Developing and implementing front-end architecture to support user interface concepts.
- Monitoring and improving front-end performance.
- Documenting application changes and developing updates.
Ideal Candidate:
- In-depth knowledge of JavaScript, CSS, HTML and front-end languages.
- Must have minimum 1.5 years of experience in React JS.
- Knowledge in Web services, Web API, REST services, HTML, CSS3
- Knowledge of React tools including React.js, Webpack, Enzyme, Redux, and Flux.
- Excellent problem-solving ability and desire to learn new technologies and platforms.
- Experience with user interface design.
- Experience with browser-based debugging and performance testing software.
- Design, build, and maintain efficient, reusable, and reliable code by setting expectations and feature priorities throughout the development life cycle
- Good teamwork.
- Excellent troubleshooting skills.
- Good project management skills.
- Flexible in their technical approach with a willingness to try new, creative technical solutions.
Working Mode: Only Office
Job Location : Bangalore
Interview Process
First Round – Telephonic Discussion
Second Round – Zoom Call




