
Computer Operator duties and responsibilities
- Identifying and correcting file and system errors.
- Performing data processing operations according to a business production schedule.
- Performing backup procedures to reduce the risk of data loss.
- Maintaining computer equipment and inventory and organizing repairs as needed.

About Vflyorions Technologies Pvt Ltd
Strong Software Engineering Profile
Mandatory (Experience 1): Must have 5+ years of experience using Python to design software solutions.
Mandatory (Skills 1): Strong working experience with Python (with Django framework experience) and Microservices architecture is a must.
Mandatory (Skills 2): Must have experience with event-driven architectures using Kafka.
Mandatory (Skills 3): Must have experience in DevOps practices and container orchestration using Kubernetes, along with cloud platforms like AWS, GCP, or Azure.
Mandatory (Company): Product companies; experience working in fintech, banking, or product companies is a plus.
Mandatory (Education): From IIT (candidate should have done a bachelor's degree: B.Tech, dual degree B.Tech+M.Tech, or Integrated M.Sc), or from other premium institutes such as NIT, MNNIT, VITS, or BITS (candidates should have done a B.E/B.Tech).
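As an illustration of the event-driven pattern this role calls for, here is a minimal, broker-free sketch in plain Python. The in-memory bus is a stand-in for a Kafka topic (no broker, no partitions); the topic and event names are invented for illustration.

```python
import json
from collections import defaultdict

class InMemoryBus:
    """Toy stand-in for a Kafka topic: producers publish JSON events,
    consumers subscribe by topic name. No broker, no persistence."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Kafka messages are typically serialized bytes; round-trip
        # through JSON to mimic that boundary.
        payload = json.dumps(event)
        for handler in self._subscribers[topic]:
            handler(json.loads(payload))

bus = InMemoryBus()
received = []
bus.subscribe("payments.completed", received.append)
bus.publish("payments.completed", {"order_id": 42, "amount": 99.5})
print(received)  # [{'order_id': 42, 'amount': 99.5}]
```

The decoupling shown here, where producers know only the topic and never the consumers, is the property that Kafka provides at scale.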
Preferred
Role Description
This is a full-time on-site role for a Telesales Specialist located in Noida. The Telesales Specialist will be responsible for customer service, communication, sales, customer support, and training in the freight brokerage industry.
**Qualifications**
- Strong customer service and support skills
- Excellent communication abilities
- Proven sales and business development skills
- Ability to generate company revenue
- Track record of achieving sales targets
- Training capabilities
- Strong interpersonal skills
- Previous experience in freight brokerage is a plus
- Bachelor’s degree in Business, Marketing, or a related field
Mandatory Criteria
- Looking for candidates from Bangalore with a notice period of 20 days or less
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 6+ years of experience in quality assurance or software testing, with at least 3 years focused on test automation and 2+ years in a leadership or senior role.
- Solid knowledge of SQL. Experience in architecting and implementing automated testing frameworks. Strong in programming/scripting languages such as Python, Java, or JavaScript for automation development.
- Expertise with automation tools like Selenium, Playwright, Appium, or RestAssured, and integrating them into CI/CD workflows.
- Proven leadership skills, including mentoring junior engineers and managing team deliverables in an agile environment.
- Experience with test management tools (e.g., TestRail, qTest) and defect tracking systems (e.g., Jira).
- Deep understanding of testing best practices, including functional, regression, performance, and security testing.
- Ability to analyze system architecture and identify key areas for test coverage and risk mitigation.
- Experience with containerization technologies (Docker, Kubernetes) and cloud platforms (AWS preferred). Understanding of performance, security, and load testing tools (e.g., JMeter, OWASP ZAP).
- Familiarity with observability and monitoring tools (e.g., ELK Stack, Datadog, Prometheus, Grafana) for test environment analysis.
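A minimal sketch of the page-object pattern that the automation frameworks above are typically built on. The driver here is a hand-rolled stub rather than Selenium or Playwright, so the sketch runs anywhere; all class and field names are illustrative.

```python
class StubDriver:
    """Stand-in for a Selenium/Playwright driver so the sketch is self-contained."""
    def __init__(self):
        self.fields = {}
        self.submitted = None

    def fill(self, name, value):
        self.fields[name] = value

    def submit(self):
        self.submitted = dict(self.fields)

class LoginPage:
    """Page object: tests call intent-level methods, not raw locators,
    so UI changes are absorbed in one place."""
    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill("username", user)
        self.driver.fill("password", password)
        self.driver.submit()
        return self.driver.submitted

driver = StubDriver()
page = LoginPage(driver)
result = page.login("alice", "s3cret")
print(result)  # {'username': 'alice', 'password': 's3cret'}
```

With a real driver, only `StubDriver` would change; the page objects and tests stay the same, which is the maintainability argument for the pattern.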
If interested, kindly share your resume at 82008 31681
Experience in Python, Django, Flask, and related frameworks in a Unix/Linux or Windows environment.
Hands-on experience in full-stack Python application development.
Design and develop Python web applications adhering to a microservices framework, considering performance and the ability to scale on demand.
Good experience working with RDBMS (Relational Database Management Systems).
Experience with microservices architecture and container/Docker-based applications.
Experience in developing web applications and REST APIs using the Flask/Django frameworks (JSON, XML, etc.).
Experience working with Apache HTTP Server or other web application servers.
Package code and create executables/binaries in Python.
Experience with the NumPy, SciPy, and Pandas libraries.
Strong unit testing and debugging skills; good understanding of Python's threading limitations and multi-process architecture.
Experience managing a source code base with version control tools such as SVN, Git, or Bitbucket.
Thorough understanding of OOP concepts.
Experience working in an Agile development environment.
Good understanding of databases (PostgreSQL/MySQL/Oracle/SQL Server).
Good communication and organization skills, with a logical approach to problem solving and good time management and task prioritization skills.
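The kind of JSON REST endpoint described above can be sketched in a framework-agnostic way using only WSGI, the interface Flask and Django themselves sit on. The `/health` route and payload are invented for illustration; a real service would use Flask or Django routing.

```python
import json

def app(environ, start_response):
    """Tiny WSGI app: GET /health returns a JSON status document."""
    if environ.get("PATH_INFO") == "/health":
        body = json.dumps({"status": "ok"}).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode("utf-8")]

# Exercise the app directly, the same way a WSGI server would call it:
statuses = []
response = app({"PATH_INFO": "/health"},
               lambda status, headers: statuses.append(status))
print(statuses[0], json.loads(response[0]))  # 200 OK {'status': 'ok'}
```

Calling the app as a plain function like this is also how WSGI test clients work under the hood, which makes such endpoints easy to unit-test.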
Requirements: Skills and Qualifications
8-10 years of experience in Python, Django, Flask, and related frameworks in a Unix/Linux or Windows environment, preferably in the banking domain.
Language: Python
Frameworks: Django, Flask
Libraries: SQLAlchemy, NumPy, SciPy, Pandas, etc.
OS: Windows, Linux/Unix
Version Control: Git, Bitbucket
Databases: MySQL, Oracle, SQL Server, PostgreSQL
Containers: Docker
Note: Urgently looking for candidates serving their notice period or immediate joiners.
Job Description:
- 3+ years of experience in functional testing with a good foundation in technical expertise
- Experience in the Capital Markets/Investment Banking domain is a MUST
- Exposure to API testing tools like SoapUI and Postman
- Well versed with SQL
- Hands on experience in debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
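The JSON/XML understanding mentioned above can be sketched with the standard library alone: extracting the same identifier from equivalent JSON and XML response bodies. The payloads are invented; in practice the responses would come from SoapUI, Postman, or a `requests` call.

```python
import json
import xml.etree.ElementTree as ET

def trade_id_from_json(payload: str) -> str:
    """Pull the trade id out of a JSON response body."""
    return json.loads(payload)["trade"]["id"]

def trade_id_from_xml(payload: str) -> str:
    """Pull the same trade id out of an equivalent XML response body."""
    return ET.fromstring(payload).find("trade").get("id")

json_body = '{"trade": {"id": "T-1001"}}'
xml_body = '<response><trade id="T-1001"/></response>'

# A typical API check: both representations must agree.
assert trade_id_from_json(json_body) == trade_id_from_xml(xml_body)
print(trade_id_from_json(json_body))  # T-1001
```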
Location:
Pune/Mumbai
About Wissen Technology:
· The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.
· Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
· Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
· Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
· Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
· We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
· Wissen Technology has been certified as a Great Place to Work®.
· Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
· Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
· We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy. They include Morgan Stanley, MSCI, StateStreet, Flipkart, Swiggy, Trafigura, GE to name a few.
Website: www.wissen.com
Role Must Have's
- Startup hiring experience in both Tech and Non-Tech domains
- 2+ years of experience
- Demonstrated experience in closing mid and senior roles at startups
- Demonstrated experience in using LinkedIn recruiter, GitHub, and social media channels for recruiting
- Excellent communication and interpersonal skills
- Nice to Have: Computer Science Academic Background
Responsibilities
- Build an effective candidate pipeline through Inbound/Outbound sourcing channels with minimal dependency
- Coordinate interview panels, schedule interviews, and run debriefs efficiently
- Be able to recruit active/passive candidates for Tech roles
- Possess strong ability to pre-screen interview candidates
- Build and maintain a network of potential candidates through pro-active market research and ongoing relationship management
- Communicate well with the hiring manager and stakeholders to ensure an effective interview process.
Behaviors & Experiences
- Independently Driven
- Experience managing and prioritizing multiple searches, projects, and client relationships
- High Learning Agility
- Extraordinarily Conscientious
IT Asset Management (Hardware and Software), Configuration, Implementation and Integration with third party systems.
IT Asset Management –
- Develop Lifecycle Management Policies, Processes, Procedures, Compliance, Governance and Documentation based on ITIL best practices.
- Have experience in Jira Configuration and Asset Management. Configuration of dashboards and reports.
- Have worked as a Jira Admin and designed workflows for PM, SM & Asset Management.
- Must have handled an Asset Management project in the past and configured the required workflows & lifecycle management.
- Deploy service asset and configuration management processes and ensure they are fit for purpose and operational at all times.
- Anticipate future IT needs versus the existing infrastructure architecture, perform impact assessments, and investigate design options.
- Maintain the asset and configuration management databases (AMDB & CMDB), ensure completeness and correctness, and act on any potential issues.
- Monitor service assets (e.g., laptops, tokens) and ensure compliance as needed. Produce technical documentation, reports, data analysis, etc., and identify opportunities for improvement and automation.
- Lead and organize processes controls and reviews on a periodic basis to ensure that the hardware overall services are running according to the defined standards. Follow latest evolutions in field of expertise and share best practices within the internal community.
- Carry out any other task in line with the main purpose of the job.
- Provide Asset Management training to the customer. Ensure compliance with relevant regulations and standards.
- Preference: HealthCare domain

You will build Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video, and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), Azure Data Catalogue, and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, team leads, and project managers to plan, build, and roll out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g., XML/JSON) and flat schemas (e.g., CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP, etc.).
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/functional scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g., HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus.
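As a toy illustration of the feed-integration bullet above, here is a sketch that normalises one structured (JSON) feed and one flat (CSV) feed into a single record shape. Field names and data are invented; a production pipeline would live in Azure Data Factory or Databricks rather than hand-rolled Python.

```python
import csv
import io
import json

def from_json_feed(payload: str):
    """Normalise a structured JSON feed into {id, amount} records."""
    return [{"id": r["id"], "amount": float(r["amount"])}
            for r in json.loads(payload)]

def from_csv_feed(payload: str):
    """Normalise a flat CSV feed into the same record shape."""
    return [{"id": row["id"], "amount": float(row["amount"])}
            for row in csv.DictReader(io.StringIO(payload))]

json_feed = '[{"id": "A1", "amount": "10.5"}]'
csv_feed = "id,amount\nB2,20.0\n"

# Both feeds land in one canonical shape, ready for loading.
records = from_json_feed(json_feed) + from_csv_feed(csv_feed)
print(records)  # [{'id': 'A1', 'amount': 10.5}, {'id': 'B2', 'amount': 20.0}]
```

Converging heterogeneous feeds onto one canonical schema early is what keeps downstream auditing, reporting, and testing uniform.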
Essential Experience:
5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
Experience working in the Public sector, or in an organisation servicing the Public sector, is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes
Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment.
Extensive experience working in a team-oriented, collaborative environment as well as working independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect.
Excellent interpersonal skills with teams and building trust with clients.
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with.
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
- Bachelor's/Master's degree in Computer Science or a related field.
- 4+ years of experience in developing web-based, E-Commerce applications on Salesforce Commerce Cloud (SFCC).
- SFCC Developer certification is preferred.
- Knowledge of SGJC and SFRA is a must.
- Experience in working with ISML, Pipelines, Pipelets, JS-Controllers, Digital Scripts, and the B2C Commerce API.
- Experience in working with Forms, Service and Job / Integration frameworks.
- Strong understanding of SFCC out-of-the-box features and good experience with storefront customizations.
- Experience in working with third-party services like Payment, Tax, Analytics, Fraud, Ratings & Reviews, etc. using LINK and custom cartridges.
- Good understanding of multi-site, multi-currency, and multi-locale setups.
- Object-oriented analysis and design using common design patterns.
- Experience in working on performance optimization and troubleshooting.
- In-depth knowledge of SFCC Business Manager.
- Knowledge of build and deployment.
- Knowledge of Source Control Management systems like GitHub/Bitbucket and branching models.
Job Specifications:
- Strong and innovative approach to problem solving and finding solutions
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Ability to multi-task under pressure and work independently with no or minimal supervision.
- Ability to negotiate and prioritize when under pressure
- Should be able to finish work with no supervision from a lead
- Excellent communicator (written and verbal, formal and informal)
- Manages conflicts effectively
- Good listening skills
Role and Responsibilities
The candidate for the role will be responsible for enabling single view for the data from multiple sources.
- Work on creating data pipelines to graph database from data lake
- Design graph database
- Write Graph Database queries for front end team to use for visualization
- Enable machine learning algorithms on graph databases
- Guide and enable junior team members
Qualifications and Education Requirements
B.Tech with 2-7 years of experience
Preferred Skills
Must Have
- Hands-on exposure to graph databases like Neo4j, Janus, etc.
- Hands-on exposure to programming and scripting language like Python and PySpark
- Knowledge of working on cloud platforms like GCP, AWS etc.
- Knowledge of Graph Query languages like CQL, Gremlin etc.
- Knowledge and experience of Machine Learning
Good to Have
- Knowledge of working on Hadoop environment
- Knowledge of graph algorithms
- Ability to work on tight deadlines
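For the graph-database work above, queries are normally written in Cypher or Gremlin against a running server. Since those need live infrastructure, here is a self-contained sketch of the underlying idea: a tiny adjacency-list graph with a breadth-first traversal, the kind of reachability question a Cypher path query answers. All node names and edges are invented.

```python
from collections import deque

# Toy adjacency list standing in for a property graph; in Neo4j these
# would be nodes and relationships queried with Cypher.
graph = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": [],
    "dave": [],
}

def reachable(graph, start):
    """Breadth-first traversal: the set of nodes reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

print(sorted(reachable(graph, "alice")))  # ['alice', 'bob', 'carol', 'dave']
```

In Cypher the equivalent would be a variable-length path match such as `MATCH (a {name: 'alice'})-[*]->(n) RETURN n`; graph databases exist to make exactly this kind of traversal fast on large, connected data.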
