Stipend will be based on performance.

Must have experience in a leather or garment export house.
· Export Documentation: Prepare and manage all pre-shipment and post-shipment export documents, including invoices, packing lists, Bill of Lading, Certificate of Origin, and other required shipping documents.
· Coordination with Buyers & Shipping Lines: Communicate with buyers, freight forwarders, customs agents, and shipping lines to ensure timely shipment and smooth documentation processing.
· Customs & Compliance: Ensure compliance with export regulations, customs laws, and DGFT policies. Handle export incentives, duty drawbacks, and export obligations.
· Banking & Payment Processing: Handle LC (Letter of Credit), bank negotiations, remittances, and documentation required for payment processing.
· Coordination with Internal Teams: Work closely with production, logistics, and finance teams to ensure the timely execution of export orders.
· Tracking & Reporting: Maintain shipment records, track shipments, and update management on export status.
· Client Communication: Handle buyer queries related to export documentation, shipping, and compliance.
Mail your updated resume with current salary to:
email: jobs[at]glansolutions[dot]com
Satish: 88 O2 74 97 43

What’s this role really about?
At Digicorp, we don’t just ship code — we build products that matter. We're looking for a well-rounded full-stack developer who’s confident working with both the front end and the back end. If you’re someone who enjoys owning features end-to-end and is equally excited about clean APIs and sleek UI, read on.
What kind of person thrives here?
- You’ve got 3+ years of solid experience in Node.js and React, and at least 1 year of Angular under your belt.
- You’re comfortable switching between React and Angular without thinking twice.
- You care about the user experience, not just the functionality.
- You’re someone who asks questions, proposes ideas, and doesn’t just “follow specs.”
- You like working closely with designers, QAs, and product folks.
- You write clean, maintainable, and testable code.
- You’ve worked with Python for at least 6 months and naturally look for ways to apply AI to solve real problems; that’s a big plus in our eyes.
- Familiarity with Azure infrastructure is also something we value.
What will you do here (besides writing code)?
- Build and improve web apps that real users rely on
- Break down problems with the team and solve them together
- Review others’ code — and have yours reviewed (with kindness!)
- Keep performance, security, and scalability in mind as you build
- Contribute to a culture of learning, sharing, and quality
We’re not just hiring a developer. We’re adding a team member who’ll help us think, build, learn, and grow — together.

Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.
Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets for reuse across different mappings.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables (a sketch of the Type 2 logic appears at the end of this summary).
Modified existing mappings to accommodate new business requirements.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Prepared migration documents to move mappings from development to testing and then to production repositories.
Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using PL/SQL.
Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica/Talend sessions as well as performance tuning of mappings and sessions.
Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
Extensive experience in writing UNIX shell scripts and automation of the ETL processes using
UNIX shell scripting.
Experience in using Automation Scheduling tools like Control-M.
Hands-on experience across all stages of Software Development Life Cycle (SDLC) including
business requirement analysis, data mapping, build, unit testing, systems integration, and user
acceptance testing.
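
The Type 2 SCD updates mentioned above were built as Informatica mappings; purely as an illustration of the underlying logic, here is a minimal pandas sketch. The column names (is_current, start_date, end_date) and the function shape are assumptions for the example, not the actual warehouse schema.

import pandas as pd

def scd_type2_apply(dim, changes, key, attrs, load_date):
    """Expire current rows whose tracked attributes changed, then append new versions.
    Assumes at most one incoming row per business key per load."""
    current = dim[dim["is_current"]].set_index(key)
    for _, row in changes.iterrows():
        k = row[key]
        known = k in current.index
        changed = known and any(current.at[k, a] != row[a] for a in attrs)
        if changed:
            # Close out the existing current version of this key.
            mask = (dim[key] == k) & dim["is_current"]
            dim.loc[mask, ["is_current", "end_date"]] = [False, load_date]
        if changed or not known:
            # Insert the incoming values as the new current version.
            new_row = {key: k, **{a: row[a] for a in attrs},
                       "start_date": load_date, "end_date": None, "is_current": True}
            dim = pd.concat([dim, pd.DataFrame([new_row])], ignore_index=True)
    return dim
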
Build, operate, monitor, and troubleshoot Hadoop infrastructure.
Develop tools and libraries, and maintain processes for other engineers to access data and write
MapReduce programs.
Strong problem-solving capabilities and able to independently think through a challenge.
Demonstrated ability to learn new technology quickly.
Every project can be different, and there may not be a “template” to work off of. We need someone who is resourceful and has a “figure it out” attitude.
2+ years of experience in Salesforce or related development
BS/MS in Computer Science or Engineering, or equivalent years of experience and technical skills
Salesforce certifications preferred, e.g. Certified Salesforce Administrator, Certified Salesforce Platform App Builder, Platform Developer II
Experience in Salesforce CRM app development with strong expertise in Sales Cloud, Service Cloud, and/or Force.com at an enterprise level
Experience with SFDC Administrative tasks like creating Profiles, Roles, User Security Models, Page Layouts, Email Services, Dashboards, Tasks, and Events
Experience with Lightning Components, Design System, APEX Classes, Process Builder, Triggers, Visualforce, Approval Processes, Aura, and Flow
Verify and validate complete end-to-end testing with the business and client/carrier
Expert knowledge of Object-Oriented programming
Experience with database design concepts and use of SOQL, SOSL, and SQL
Experience with Salesforce API and Web Services (REST/SOAP/Bulk); see the sketch after this list
Experience with environment management, release management, code versioning, deployment methodologies, and CI/CD tools
Utilize strong written and oral communication skills to regularly update stakeholders on project status, e.g. project phase, issues/roadblocks, go-live dates, etc.
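
For the REST API experience noted above, here is a minimal Python sketch of running a SOQL query against the Salesforce REST API. The instance URL, access token, and API version are placeholders; a real integration would obtain the token through an OAuth 2.0 flow.

import requests

INSTANCE_URL = "https://yourorg.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "<oauth-access-token>"               # placeholder token
API_VERSION = "v58.0"

def soql_query(soql):
    """Run a SOQL query via the REST API and page through all results."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    params = {"q": soql}
    records = []
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["records"])
        # Salesforce returns nextRecordsUrl while more pages remain.
        url = INSTANCE_URL + body["nextRecordsUrl"] if not body["done"] else None
        params = None  # the next-records URL already encodes the query
    return records

accounts = soql_query("SELECT Id, Name FROM Account LIMIT 10")
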
- Would be responsible for end-to-end loan processing, from initial receipt of the loan application to the credit approval stage, within TAT parameters.
- Reviewing the loan application.
- Reviewing the credit application.
- Credit underwriting
- KYC verification.


XpressBees – a logistics company started in 2015 – is among the fastest-growing companies in its sector. Our vision to evolve into a strong full-service logistics organization is reflected in our various lines of business, such as B2C logistics, 3PL, B2B Xpress, hyperlocal, and cross-border logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into India's most trusted logistics partner. XB has progressively carved its way towards best-in-class technology platforms, an extensive logistics network reach, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as the service provider of choice and leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a rapid pace. This is a great opportunity to join the team forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning and Data Engineering, leading projects and teams of AI Engineers in collaboration with Data Scientists. In this role, you will build high-performance AI/ML solutions using groundbreaking AI/ML and Big Data technologies. You will need to understand business requirements and convert them into solvable data science problem statements. You will be involved in end-to-end AI/ML projects, from small-scale POCs all the way to full-scale ML pipelines in production.
Seasoned AI/ML Engineers will own the implementation and productionization of cutting-edge, AI-driven algorithmic components for search, recommendation, and insights that improve the efficiency of the logistics supply chain and serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact on the organization while solving challenging problems in the areas of AI, ML, Data Analytics, and Computer Science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
An Artificial Intelligence Engineer will work in areas such as:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning: Logistic Regression, Decision Trees, Random Forests, XGBoost, etc. (see the sketch after this list)
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
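
As a minimal illustration of two of the classifiers named above, here is a scikit-learn sketch trained on synthetic data. The data is a stand-in for a real binary outcome (e.g. a fake-delivery-attempt label), not XpressBees data.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a binary outcome such as "delivery attempt is fake".
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{type(model).__name__}: test AUC = {auc:.3f}")
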
The AI Engineering team enables internal teams to add AI capabilities (Decision Support, NLP, Computer Vision, NLU, and Conversational AI) to their apps and workflows easily via APIs, for public clouds and the enterprise, without each team needing to build its own AI expertise. The candidate is adept at working with large data sets to find opportunities for product and process optimization, and at using models to test the effectiveness of different courses of action. They must have experience with a variety of data mining/data analysis methods and data tools, with building and implementing models, with using/creating algorithms, and with creating/running simulations. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend, that automates training and
deployment of ML models.
● Build cloud services in Decision Support (Anomaly Detection, Time series forecasting, Fraud detection, Risk prevention, Predictive analytics), computer vision, natural language processing (NLP), and speech that work out of the box.
● Brainstorm and design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists and SW engineers to build out other parts of the infrastructure, effectively communicating your needs and understanding theirs, and address internal and external stakeholders' product challenges.
● Build the core of Artificial Intelligence and AI services such as Decision Support, Vision, Speech, Text, NLP, NLU, and others.
● Leverage cloud technology: AWS, GCP, Azure.
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow) and Big Data technologies such as Hadoop, HBase, Spark, etc.
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experiences, supply chain metrics, and other business outcomes.
● Develop the company's A/B testing framework and test model quality (see the sketch after this list).
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects using data science techniques and associated libraries, such as AI/ML or equivalent NLP (Natural Language Processing) packages. These techniques require a strong understanding of statistical models, probabilistic algorithms, classification, clustering, deep learning, and related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
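
For the A/B testing responsibility above, the core statistic is typically a two-proportion z-test comparing conversion rates between variants. Here is a minimal sketch; the conversion counts are hypothetical, for illustration only.

from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, 2 * norm.sf(abs(z))

# Hypothetical experiment: variant B converts at 5.6% vs. 4.8% for A.
z, p = two_proportion_ztest(480, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value suggests a real lift
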
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computer languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.
● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of web services: Redshift, S3, Spark, Digital Ocean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from 3rd-party providers: Google Analytics, Site Catalyst, Coremetrics, AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using Quicksight, Periscope, Business Objects, D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for Machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling/orchestration tools such as Airflow or Oozie (a minimal sketch follows this list)
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine-tuning and optimizing ML (especially Deep Learning) models to bring down serving latency.
● Exposure to the ML model productionization stack (e.g. MLflow, Docker)
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.
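
For the Airflow experience mentioned above, here is a minimal DAG sketch for Airflow 2.x. The DAG id, task names, and schedule are hypothetical, and the callables are placeholders for real pipeline steps.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting features")   # placeholder for pulling data from the warehouse

def train():
    print("training model")        # placeholder for model training

with DAG(
    dag_id="ml_pipeline_example",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_train = PythonOperator(task_id="train", python_callable=train)
    t_extract >> t_train           # train runs only after extract succeeds
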
Location: Hyderabad
Qualifications: BE/BTech in EE/EC
Experience: 3-5 Years
Desired Candidate Profile:
-Development of embedded device drivers: Ethernet, USB, SPI.
-Able to read data sheets.
-Good debug skills.
-Board bring-up skills.
-Worked on application-processor-type boards (T2080/T2081, etc.)
-Strong in embedded C
-Proficiency in board bring-up and device drivers is needed.
-System programming and building multi-platform SDK on Linux or other OS is desirable.
-Onsite system integration.
-Should have hands-on debugging experience.
-Candidate is expected to have strong knowledge of C and C++.
-Understanding and experience using structured programming techniques.
Job Type: Full-time
Salary: ₹300,000.00 - ₹700,000.00 per year
Schedule:
Morning shift
Ability to commute/relocate:
Hyderabad - 500003, Telangana: Reliably commute or planning to relocate before starting work (Required)
Experience:
total work: 3 years (Preferred)
- A strong work ethic and sense of commitment
- Responsible for the end-to-end recruitment cycle, i.e., sourcing qualified candidates through deploying selected candidates with clients.
- Understanding & analyzing the requirement of the position based on client’s specifications.
- Experience in end-to-end recruitment: sourcing to onboarding candidates.
- Sourcing of candidates via internet postings, networking, headhunting and internal Database.
- Ability to source resumes quickly and work on multiple positions simultaneously; thorough screening of applications to identify quality candidates.
- Sound knowledge of sourcing candidates through RESDEX, Monster, and various job sites.
- Excellent knowledge of Boolean search and usage of various job board databases.
- Foster long-term relationships with candidates and clients
- Draft job descriptions, push them through all talent acquisition channels, and coordinate candidate sourcing with the same.
- Doing junior- to middle-level hiring for all verticals.
- Review applicants to verify if position requirements are met.
- Format resumes to meet client expectations.
- Telephonic screening of profiles, checking for technical fitment, behavioral fitment, stability, etc.
- Utilize your innovative and effective convincing and negotiation skills to impress and attract top talent.
- Follow up with candidates and clients and maintain records of progress.
- Craft and send personalized recruiting emails with current job openings to passive candidates.
- Keep up to date with new technological trends and products
- You need to be comfortable being an individual contributor and working in an ambiguous, chaotic and fast paced work environment.
- Strong decision-making skills
- Experience in end-to-end recruitment; a quick learner with a passion for bridging talent with opportunity
Minimum qualifications -
- Bachelor’s degree
- Telephonic Skills
- Interviewing & Negotiation skills
- Excellent interpersonal communication skills
- Excellent English communication, written and verbal, with very good e-mail writing skills and etiquette.
- Professionalism
- Strong social aptitude and ability to build relationships
- Technical knowledge to be able to talk to job prospects and understand qualifications

1- Must have experience in all kinds of customization work in Magento
2- Must have experience in Magento 1 & Magento 2
3- Must have experience in PHP
4- Must have Magento theme customization experience
5- Must have Magento extension integration experience
6- Must have completed around 10 projects in Magento
7- Must be able to work independently or with minimal supervision

