Company Description
Miratech is an IT services and outsourcing company that serves multinational organizations all over the world. Our highly professional team achieves a 99% success rate on IT projects in the financial, telecommunications, and technology domains. Founded in 1989, Miratech is headquartered in New York, USA, with R&D centers in Poland, the Philippines, Slovakia, Spain, and Ukraine. Technical complexity is our passion, stability is our standard, and a friendly work environment is our style. We empower our employees to grow together with the company, achieve ambitious goals, and be part of a relentless international team that helps visionaries change the world.
Job Description
We are looking for a Bot Developer to join our team who will help us build solutions and implement new technologies.
The ideal candidate will have strong knowledge of the technologies and programming languages used to develop conversational chatbots, along with a good understanding of dialog systems and hands-on experience building chatbots with the Microsoft Bot Framework.
Responsibilities:
- Design and implement voice and chat bots
- Troubleshoot and resolve issues related to voice/chat bots
- Assist in planning and estimating development projects and sprints
- Take part in code reviews and contribute to team knowledge sharing
- Provide technical guidance and support to other team members
- Work in an agile environment, using methodologies like Scrum or Kanban
Qualifications
- 2-3 years of experience in bot development using Node.js
- Strong experience developing bots with the Azure Bot Framework (a brief illustrative sketch follows this list)
- Experience building ML-based conversational AI using Azure Cognitive Services, including LUIS for language understanding
- Experience working with REST API calls, JSON, and systems integration
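As a rough illustration of the Bot Framework skills listed above, here is a minimal echo-bot sketch. The role itself asks for the Node.js SDK; this sketch uses the Bot Framework's Python SDK (botbuilder-core) purely for illustration, since the same ActivityHandler pattern exists in both SDKs. The class name and greeting text are hypothetical.

```python
# Minimal Bot Framework echo bot sketch (Python SDK, botbuilder-core).
# Illustrative only: the role asks for the Node.js SDK, which follows the
# same ActivityHandler pattern. Class and message text are hypothetical.
from botbuilder.core import ActivityHandler, TurnContext


class EchoBot(ActivityHandler):
    """Replies to every incoming message by echoing the user's text."""

    async def on_message_activity(self, turn_context: TurnContext):
        # Echo the incoming text back to the user.
        await turn_context.send_activity(f"You said: {turn_context.activity.text}")

    async def on_members_added_activity(self, members_added, turn_context: TurnContext):
        # Greet users (but not the bot itself) when they join the conversation.
        for member in members_added:
            if member.id != turn_context.activity.recipient.id:
                await turn_context.send_activity("Hello! I am a simple echo bot.")
```

In a real deployment, this handler would be hosted behind a bot adapter endpoint and registered with an Azure Bot resource; language understanding (e.g., LUIS) would then be layered onto the message handler.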
Secondary Skills
- Ability to work with business and technology teams to build and deploy an analytical solution as per client needs.
- Ability to multi-task, solve problems and think strategically.
- Strong communication and collaboration skills

Similar jobs
Hiring for Azure Data Engineers.
Location: Bangalore
Employment type: Full-time, permanent
Website: www.amazech.com
Qualifications:
B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, or Electrical/Electronics Engineering, with a good academic background.
Experience and Required Skill Sets:
• Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer
• Experience with data warehouse/analytical systems using Azure Synapse
• Proficiency in creating Azure Data Factory pipelines for ETL processing (copy activity, custom Azure development, Synapse, etc.); see the sketch after this list
• Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview
• Good technical knowledge of the Microsoft SQL Server BI suite (ETL, reporting, analytics, dashboards) using SSIS, SSAS, SSRS, and Power BI
• Design and development of batch and real-time streaming data loads into data warehouse systems
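For illustration of the "copy activity" item above, the sketch below defines a simple Data Factory pipeline programmatically with the azure-mgmt-datafactory Python SDK, broadly mirroring Microsoft's documented Python quickstart pattern. Subscription, resource group, factory, and dataset names are placeholders, and the linked services/datasets are assumed to already exist.

```python
# Sketch: define an Azure Data Factory pipeline with a single copy activity
# (Blob -> Azure SQL) using the Python management SDK. All resource names
# below are placeholders; datasets/linked services are assumed to exist.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSource, SqlSink, CopyActivity, DatasetReference, PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-analytics"         # placeholder
FACTORY_NAME = "adf-demo"               # placeholder

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Copy activity: read from a blob dataset, write to an Azure SQL dataset.
copy_step = CopyActivity(
    name="CopyBlobToSql",
    inputs=[DatasetReference(type="DatasetReference", reference_name="BlobInputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SqlOutputDataset")],
    source=BlobSource(),
    sink=SqlSink(),
)

# Create (or update) the pipeline, then trigger a run.
adf_client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "BatchLoadPipeline",
    PipelineResource(activities=[copy_step]),
)
run = adf_client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "BatchLoadPipeline", parameters={})
print("Started pipeline run:", run.run_id)
```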
Other Requirements:
A Bachelor's or Master's degree (Engineering or computer-related degree preferred)
Strong understanding of Software Development Life Cycles including Agile/Scrum
Responsibilities:
• Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.
• Responsibility for the bottom line, strong project management abilities, and the ability to keep the team to timelines.
Job Title: MuleSoft Lead/Architect
Location: Hyderabad/Pune
Prolifics, a pioneering technology solutions provider, is seeking a talented and inventive MuleSoft developer to join our dynamic team. At Prolifics, we believe in empowering our employees to push the boundaries of innovation, think outside the box, and deliver game-changing solutions to our clients.
To excel in this role, you should have:
- 5+ years of hands-on experience with MuleSoft and API management, with excellent knowledge of SOA and ESB concepts.
- Experience configuring APIs, proxy endpoints, API portals, and API analytics with MuleSoft API Manager based on technical specifications.
- Deep understanding of VPCs and dedicated load balancers (DLBs).
- Deep understanding of REST, HTTP, MQ, JSON, XML, and SOA. Experience designing and developing enterprise services using RAML in Mule, REST-based APIs, and SOAP web services, and using a range of Mule connectors.
- Strong understanding of and experience with security implementations (e.g., SSL/mutual SSL (mTLS), SAML, OAuth).
- Strong understanding of and experience with Mule 4 and the DataWeave language.
- Ability to interface with clients, technology partners, testing, architecture, and analysis groups.
- Experience implementing policies in API Manager.
- Deep experience with Anypoint Platform: flow design, API design, DataWeave, CloudHub, Runtime Fabric, and API management.
- Knowledge of Jenkins and CI/CD processes, including Azure/AWS CI/CD pipelines.
- Prior experience integrating with different applications and systems (SOAP and REST web services, SAP, Salesforce, databases, etc.) through MuleSoft connectors.
- Experience creating mapping documents by working with source and target systems.
- Experience deploying APIs to CloudHub, Runtime Fabric, on-premises workers, etc.
- MuleSoft Runtime Fabric (RTF) deployment experience is a plus.
- Prior experience with MUnit and automation testing using JMeter.
- SonarQube experience is a plus.
- Anypoint MQ experience is a must.
- Ability to understand and apply reusable code design, leverage application architecture patterns and framework capabilities, and design/develop solutions that are highly reliable, scalable, and perform to meet business-defined service levels.
- Experience with Splunk/ELK is a plus.
- Experience with microservices development, preferably API implementation in MuleSoft.
- Excellent communication skills.
- Excellent interpersonal and analytical skills.
- Excellent attention to detail.
- Must be a quick learner, able to ramp up quickly on MuleSoft technology and architectural standards.
Minimum 7-8 years of hands-on experience with data integration platforms in the cloud (Azure preferred).
Experience in developing ETL activities for Azure: big data, relational databases, and data warehouse solutions.
Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Azure Analysis Services, Azure Databricks, Azure Data Catalog, ML Studio, AI/ML, Snowflake, etc. (a brief sketch follows below).
Well versed in DevOps and CI/CD deployments.
Experience with cloud migration methodologies and processes, including tools such as Azure Data Factory, Data Migration Service, SSIS, etc.
Minimum of 2 years of RDBMS experience.
Experience with private and public cloud architectures, their pros/cons, and migration considerations.
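To make the data-processing expectations above more concrete, here is a minimal PySpark batch ETL sketch of the kind typically run on Azure Databricks: read raw data from ADLS Gen2, apply a simple transformation, and load the result into a warehouse staging table over a generic JDBC connection. Paths, credentials, and table names are placeholders.

```python
# Sketch: batch ETL step in PySpark (e.g., on Azure Databricks).
# Extract from ADLS Gen2, transform, and load into a DW staging table.
# Paths, JDBC URL, credentials, and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-load-demo").getOrCreate()

# 1. Extract: raw parquet landed in ADLS Gen2 (abfss path is a placeholder).
raw = spark.read.parquet("abfss://raw@mydatalake.dfs.core.windows.net/sales/2024/")

# 2. Transform: basic cleansing and daily aggregation.
daily = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# 3. Load: append into a warehouse staging table via JDBC.
(daily.write
      .format("jdbc")
      .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=dw")
      .option("dbtable", "staging.daily_sales")
      .option("user", "etl_user")
      .option("password", "<secret>")
      .mode("append")
      .save())
```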
Nice-to-Have Skills/Qualifications:
- DevOps on an Azure platform
- Experience developing and deploying ETL solutions on Azure
- IoT, event-driven, microservices, Containers/Kubernetes in the cloud
- Familiarity with the technology stack available in the industry for metadata management: data governance, data quality, MDM, lineage, data catalogs, etc.
- Multi-cloud experience (Azure, AWS, Google Cloud) is a plus
Professional Skill Requirements:
Proven ability to build, manage and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Desire to work in an information systems environment
Excellent communication (written and oral) and interpersonal skills
Excellent leadership and management skills
Excellent organizational, multi-tasking, and time-management skills
(Kindly note this is not a development role. Experience in and/or interest in production support/operations is mandatory.)
C2H (contract-to-hire) position on a long-term contract, with strong potential for full-time conversion with an excellent company.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Actively participate in the exchange of ideas and information within the department.
Participate in the rollout of new business system integrations.
Serve as an escalation point for system and troubleshooting issues, and assist in resolving production issues.
Participate in or lead solution design sessions.
Ability to work effectively under pressure with changing priorities and tight deadlines
Provide technical advice, code reviews and assistance to other programmers.
Write complex integration solutions, optimized data extracts, updates and interfaces.
Interface with functional specialists to ensure solution design and integrity.
Devise sample input data to test the accuracy of programs.
Observe or run tests of programs using sample or actual data; assist in user acceptance testing.
Understand and embrace the global IT strategic direction.
Adhere to all safety and health rules and regulations associated with this position and as directed by your superior.
Proactively organize verbal and written ideas clearly and use an appropriate business style
Be effective and consistent in providing regular updates to appropriate managers
Communication:
Ask questions; encourage input from team members
Able to work with and participate in a small, multicultural professional team.
Provide regular updates to appropriate managers
Confer with reporting manager on complex or unusual situations
Maintain thorough knowledge of, and in-depth field experience with, the emerging technologies required to fulfill the job; this could include formal or self-paced professional development.
Participate in the exchange of ideas and information within the department.
Offer new ideas and suggestions for improvement. Identify and implement new practices and processes that are "best in field".
Requirements:
KNOWLEDGE REQUIREMENTS
Experience in analysis, design, coding, and implementation of enterprise integration solutions.
Experience with webMethods components: Integration Server, Broker, Trading Networks, and SAP Adapter; webMethods 10.x preferred.
Experience with RosettaNet PIPs and standards.
Experience with EDI standards (ANSI X12) and AS2 protocols.
Experience with XML/JSON and web services (SOAP/REST).
Experience with PL/SQL.
Experience with flat-file processing and SFTP.
General understanding of scalability and high-availability systems.
Excellent communication/documentation skills.
Experience in system administration is a plus.
Experience with environment upgrades is a plus (e.g., version 9.x to 10.x).
EDUCATION & EXPERIENCE REQUIREMENTS
B.E./B.Tech/M.Tech/MCA
7 to 8 years of experience in a relevant IT position, or equivalent external work experience
WORKING CONDITIONS
Location: Noida
Travel requirements: Domestic and/or International, up to 10%.
Climate controlled office environment during normal business hours.
SAP B1 Developer
Candidate Profile:
- Degree or comparable qualification in computer science.
- Experience with ERP systems, ideally SAP.
- Experience with object-oriented development in Java, C#, and/or VB.NET.
- Very good knowledge of front-end and back-end web development (JavaScript, HTML5, CSS).
- Experience with web services (REST, SOAP).
- Database application knowledge (SQL).
- Knowledge of handling data formats (JSON, XML).
- Experience with version control systems (Git, TFVC).
● Good experience with continuous integration and deployment tools like Jenkins, Spinnaker, etc.
● Ability to understand problems and craft maintainable solutions.
● Working cross-functionally with a broad set of business partners to understand and integrate their API or data flow systems with Xeno, so a minimal understanding of data and API integration is a must.
● Experience with Docker and microservice-based architecture using orchestration platforms like Kubernetes.
● Understanding of public cloud; we use Azure and Google Cloud.
● Familiarity with web servers like Apache, nginx, etc.
● Knowledge of monitoring tools such as Prometheus, Grafana, New Relic, etc. (see the sketch after this list).
● Scripting in languages like Python, Golang, etc. is required.
● Some knowledge of database technologies like MySQL and Postgres is required.
● Understanding of Linux, specifically Ubuntu.
● Bonus points for knowledge of security best practices.
● Knowledge of Java or NodeJS would be a significant advantage.
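As a small example of the scripting-plus-monitoring combination mentioned above, here is a hedged Python sketch that queries the Prometheus HTTP API for scrape targets that are currently down. The Prometheus URL is a placeholder, and the exit-code convention is just one possible way to hook it into cron or CI alerting.

```python
# Sketch: a small monitoring script that queries the Prometheus HTTP API
# (/api/v1/query) for scrape targets reporting up == 0.
# The Prometheus URL below is a placeholder.
import sys
import requests

PROMETHEUS_URL = "http://prometheus.internal:9090"  # placeholder


def targets_down() -> list[str]:
    """Return the instance label of every target currently reporting up == 0."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": "up == 0"},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json()["data"]["result"]
    return [r["metric"].get("instance", "unknown") for r in results]


if __name__ == "__main__":
    down = targets_down()
    if down:
        print("Targets down:", ", ".join(down))
        sys.exit(1)  # non-zero exit so cron/CI alerting can pick it up
    print("All targets up")
```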
Initially, when you join, some of the projects you'd get to own are:
● Auditing and improving the overall security of the infrastructure.
● Setting up different environments for different teams, such as QA, Development, and Business.
As a Partner Development Solution Architect focused on GSI partners within Aqua Security, you will have the opportunity to deliver on a strategy to build mindshare and broad use of the Aqua Platform across the partner community. Your broad responsibilities will include owning the technical engagement with strategic partners, positioning Aqua to be part of partner offerings, and assisting with the creation of new technical strategies that help partners build and grow their application security practice business. You will be responsible for providing subject-matter expertise on the security of running cloud native workloads, which are rapidly being adopted in enterprise deployments. You will also drive technical relationships with all stakeholders and support sales opportunities, and you will work closely with the internal sales and partner sales teams throughout the sales process to ensure all of the partners' technical needs are understood and met with the best possible solution.
Responsibilities:
The ideal person will have excellent communication skills and be able to translate technical requirements for a non-technical audience. This person can multi-task and is self-motivated while still interacting well with a team, and is highly organized with a high energy level and a can-do attitude. Required skills include:
- Experience as a sales engineer or solution architect, working with enterprise software products or services.
- Ability to assess partner and customer requirements, identify business problems, and demonstrate proposed solutions.
- Ability to present at technical meetups.
- Ability to work with partners and conduct technical workshops
- Recent familiarity or hands-on experience with:
- Linux distributions, Windows Server
- Networking configurations, routing, firewalling
- DevOps eco-system: CI/CD tools, datacenter automation, open source tools like Jenkins
- Cloud computing environments (AWS, Azure, and Google Compute)
- Container technologies like Docker, Kubernetes, OpenShift and Mesos
- Knowledge of general security practices & DevSecOps
- Up to 25% travel is expected. The ideal candidate will be located in Hyderabad, India.
Requirements:
- 7+ years of hands on implementation or consulting experience
- 3+ years in customer- and/or partner-facing roles
- Experience working with end users or developer communities
- Experience working effectively across internal and external organizations
- Knowledge of the software development lifecycle
- Strong verbal and written communications
- BS degree or equivalent experience required
- Data pre-processing, data transformation, data analysis, and feature engineering
- Performance optimization of scripts (code) and productionizing of code (SQL, Pandas, Python or PySpark, etc.)
Required skills:
- Bachelor's in Computer Science, Data Science, Computer Engineering, IT, or equivalent
- Fluency in Python (Pandas), PySpark, SQL, or similar
- Azure Data Factory experience (minimum 12 months)
- Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process
- Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
- Ability to work independently, with demonstrated experience in project or program management
- Azure experience; ability to translate data scientists' Python code and make it efficient (production-ready) for cloud deployment (a minimal sketch follows below)
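To illustrate the last requirement, the sketch below shows a notebook-style pandas aggregation alongside an equivalent PySpark version that scales on cluster compute, which is the typical shape of "translating data scientist code for production". Column names, file paths, and function names are hypothetical.

```python
# Sketch: "productionizing" a data scientist's pandas snippet by translating
# it to PySpark. Column names, paths, and function names are hypothetical;
# both functions compute the same per-customer average.
import pandas as pd
from pyspark.sql import SparkSession, functions as F


def mean_spend_pandas(csv_path: str) -> pd.DataFrame:
    """Original notebook-style pandas version (fine for small samples)."""
    df = pd.read_csv(csv_path)
    df = df[df["status"] == "completed"]
    return df.groupby("customer_id", as_index=False)["amount"].mean()


def mean_spend_spark(spark: SparkSession, parquet_path: str):
    """Equivalent PySpark version for full-scale data in the lake."""
    df = spark.read.parquet(parquet_path)
    return (
        df.filter(F.col("status") == "completed")
          .groupBy("customer_id")
          .agg(F.avg("amount").alias("amount"))
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("productionize-demo").getOrCreate()
    mean_spend_spark(spark, "abfss://curated@lake.dfs.core.windows.net/orders/").show(5)
```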
Amazon Web Services (AWS) is carrying on that tradition while leading the world in Cloud technologies. As a member of the AWS Professional Services team you will be at the forefront of this transformational technology assisting a global list of companies that are taking advantage of a growing set of services and features to run their mission-critical applications.
Professional Services engages in a wide variety of projects for customers and partners, providing collective experience from across the AWS customer base, and is obsessed with strong success for the customer. Our team collaborates across the entire AWS organization to bring access to product and service teams, get the right solution delivered, and drive feature innovation based upon customer needs. You will collaborate with our customers and/or partners on key engagements, develop and deliver proof-of-concept projects, technical workshops, and feature comparisons, and execute migration projects.
You will be based in Hyderabad and might have to travel globally to ensure customer success.
Responsibilities :
- Employ customer facing skills to represent AWS well within the customer's environment and drive discussions with technical and business teams.
- As a key member of the team, ensure success in designing, building and migrating applications, software, and services on the AWS platform
- Participate in architectural discussions and design exercises to create large scale solutions built on AWS and also be part of the development lifecycle.
- Identify workarounds for specific issues and corner-case scenarios observed during migration
- Automate solutions for repeatable problems
- Develop test plans and test cases to demonstrate application/database readiness post-migration
- Work closely with application teams to ensure business functionality and SLAs are met
- Consult for optimal design of database environments, analyzing complex distributed production deployments, and making recommendations to optimize performance
- Develop innovative solutions to complex business and technology problems
- Educate customers on the value proposition of AWS and AWS services
- Partner with the sales team to design solutions for customers that drive AWS adoption and revenue
- Conduct technical sessions for internal teams, partners and customers
BASIC QUALIFICATIONS :
- 12+ years of experience in a technical position.
- 4+ years on any cloud platform (AWS, Azure, Google Cloud, etc.).
- Bachelor's degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field.
- Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations.
- Strong programming skills in Java and/or Python.
- Strong hands-on experience in integrating multiple databases like Oracle, SQL Server, PostgreSQL etc.
- Deep hands-on experience in the design, development and deployment of business software at scale.
- Customer facing skills to represent AWS well within the customer's environment and drive discussions with senior personnel regarding trade-offs, best practices, project management and risk mitigation
- Experience leading or contributing to highly available and fault-tolerant enterprise and web-scale software applications.
- Experience in performance optimization techniques.
- High-end troubleshooting and communication skills.
- Proven experience with software development life cycle (SDLC) and agile/iterative methodologies required
PREFERRED QUALIFICATIONS :
- Implementation experience with primary AWS services (EC2, ELB, RDS, Lambda, API Gateway, Route 53, and S3).
- AWS Certified Solutions Architect.
- Infrastructure automation through DevOps scripting (e.g., shell, Python, Ruby, PowerShell, Perl); see the sketch after this list.
- Configuration management using CloudFormation and/or Chef/Puppet.
- Experience in database programming (PL/SQL, etc.).
- Demonstrated ability to think strategically about business, product, and technical challenges
- Integration of AWS cloud services with on-premise technologies from Microsoft, IBM, Oracle, HP, SAP etc.
- Experience with IT compliance and risk management requirements (e.g., security, privacy, SOX, HIPAA, etc.).
- Extended travel to customer locations may be required to sell and deliver professional services as needed
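As a small illustration of the "DevOps scripting" item in the list above, here is a hedged boto3 sketch that inventories running EC2 instances and archives the report to S3. The region and bucket name are placeholders; it only shows the general kind of automation the role mentions, not any prescribed AWS process.

```python
# Sketch: a DevOps-style automation script using boto3 - list running EC2
# instances and archive the inventory to S3. Region and bucket are placeholders.
import json
from datetime import datetime, timezone

import boto3

REGION = "us-east-1"                 # placeholder
BUCKET = "my-ops-inventory-bucket"   # placeholder


def running_instances(region: str) -> list[dict]:
    """Return id, type, and launch time for every running EC2 instance."""
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instances")
    found = []
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                found.append({
                    "InstanceId": inst["InstanceId"],
                    "InstanceType": inst["InstanceType"],
                    "LaunchTime": inst["LaunchTime"].isoformat(),
                })
    return found


if __name__ == "__main__":
    inventory = running_instances(REGION)
    key = f"ec2-inventory/{datetime.now(timezone.utc):%Y-%m-%d}.json"
    boto3.client("s3", region_name=REGION).put_object(
        Bucket=BUCKET, Key=key, Body=json.dumps(inventory, indent=2).encode()
    )
    print(f"Wrote {len(inventory)} instances to s3://{BUCKET}/{key}")
```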
Amazon is an equal opportunity employer. Amazon or its Recruitment Partners do not charge any fee or security deposit from the candidate for offering employment.

