
• Excellent skills in writing about complex technical and business subjects for a variety of audiences
• Able to think like a user and create intuitive, succinct documentation as and when needed
• Able to understand the product thoroughly and interact with developers
• Ability to interact with different business groups and scope documentation assignments
• Exposure to eCommerce / Retail Analytics is preferable
• Precise documentation and clear communication skills
• Experience in API documentation is an added bonus
Skills and Experience
• Experience in content review and localization
• Ability to conceptualize and execute projects, with a strong bias for action and the ability to prioritize and meet deadlines
• Experience with content management systems such as WordPress
• Experience working in an Agile content development environment
• Experience in editorial/peer review
• Knowledge of eCommerce tools and plugins will be an advantage


5-7 years of experience in Data Engineering, with solid experience in the design, development, and implementation of end-to-end data ingestion and data processing systems on the AWS platform.
2-3 years of experience in AWS Glue, Lambda, AppFlow, EventBridge, Python, PySpark, Lake House, S3, Redshift, Postgres, API Gateway, CloudFormation, Kinesis, Athena, KMS, IAM.
Experience in modern data architecture, Lake House, Enterprise Data Lake, Data Warehouse, API interfaces, solution patterns, standards, and optimizing data ingestion.
Experience building data pipelines from source systems such as SAP Concur, Veeva Vault, Azure Cost, various social media platforms, or similar sources.
Expertise in analyzing source data and designing a robust and scalable data ingestion framework and pipelines adhering to the client's Enterprise Data Architecture guidelines.
Proficient in the design and development of solutions for real-time (or near real-time) stream data processing as well as batch processing on the AWS platform (a brief sketch follows this list).
Work closely with business analysts, data architects, data engineers, and data analysts to ensure that the data ingestion solutions meet the needs of the business.
Troubleshoot and provide support for issues related to data quality and data ingestion solutions. This may involve debugging data pipeline processes, optimizing queries, or troubleshooting application performance issues.
Experience working with Agile/Scrum methodologies, CI/CD tools and practices, coding standards, code reviews, source management (GitHub), JIRA, JIRA Xray, and Confluence.
Experience or exposure to design and development using full-stack tools.
Strong analytical and problem-solving skills, excellent communication (written and oral), and interpersonal skills.
Bachelor's or master's degree in computer science or a related field.
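
The requirements above centre on serverless ingestion and stream/batch processing with Lambda, Kinesis, and S3. As a minimal sketch of the kind of code involved (not taken from this posting), the hypothetical Lambda handler below decodes a batch of Kinesis records and lands them in S3; the bucket name and object-key scheme are illustrative assumptions.

```python
import base64
import os

import boto3  # AWS SDK for Python; available in the Lambda runtime

s3 = boto3.client("s3")

# Hypothetical landing bucket; in practice this comes from configuration.
LANDING_BUCKET = os.environ.get("LANDING_BUCKET", "example-landing-bucket")


def handler(event, context):
    """Decode a batch of Kinesis records and land them in S3 as JSON lines."""
    lines = []
    for record in event.get("Records", []):
        # Kinesis payloads arrive base64-encoded in the Lambda event.
        payload = base64.b64decode(record["kinesis"]["data"])
        lines.append(payload.decode("utf-8"))

    if lines:
        # Key naming by first sequence number is illustrative only.
        key = f"raw/{event['Records'][0]['kinesis']['sequenceNumber']}.jsonl"
        s3.put_object(Bucket=LANDING_BUCKET, Key=key,
                      Body="\n".join(lines).encode("utf-8"))

    return {"records_processed": len(lines)}
```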

POST - SENIOR DATA ENGINEER WITH AWS
Experience: 5 years
Must-have:
• Highly skilled in Python and PySpark
• Expertise in writing AWS Glue ETL job scripts (see the sketch at the end of this posting)
• Experience in working with Kafka
• Extensive SQL DB experience – Postgres
Good-to-have:
• Experience in working with data analytics and modelling
• Hands-on experience with the Power BI visualization tool
• Knowledge of and hands-on experience with a version control system (Git)
Common:
• Excellent communication and presentation skills (written and verbal) at all levels of an organization
• Should be results-oriented, with the ability to prioritize and drive multiple initiatives to completion on time
• Proven ability to influence a diverse, geographically dispersed group of individuals; to facilitate, moderate, and influence productive design and implementation discussions; and to drive towards results
Shifts - Flexible (may need to work US shift timings for meetings).
Employment Type - Any
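
Since the must-haves call for writing AWS Glue ETL job scripts in Python/PySpark, here is a minimal, hedged skeleton of a Glue job; the catalog database, table, column mappings, and S3 output path are placeholders, not details from this posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from a hypothetical Glue Data Catalog table.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="example_orders"
)

# Simple column mapping / rename step.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "double", "order_amount", "double"),
    ],
)

# Write the result to S3 as Parquet (path is illustrative).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```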


We are looking for a good JavaScript developer who is proficient with React.js. Your primary focus will be on developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux). You will ensure that these components and the overall application are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure. Therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important.
Requirements
- 3+ years of experience in React JS and Redux
- Expert in React.js, ideally using TypeScript language extensions
- Good understanding of JavaScript Design Patterns
- Good experience writing front-end test cases
- Exposure to React Native is preferred.
- HTML5, CSS3, and responsive web design are mandatory
- Exposure to Scrum methodology and XP technical practices such as unit testing, pair programming, test-driven development, continuous integration or continuous delivery
- Self-motivated, fast learner, detail-oriented, a team player with a sense of humour


.NET Developer
Experience: 2 to 5 years
CTC: 12 LPA
Location: Mumbai, Bangalore, Chennai & Delhi-NCR

Ruby on Rails Developer Responsibilities:
- Designing and developing new web applications.
- Maintaining and troubleshooting existing web applications.
- Writing and maintaining reliable Ruby code.
- Integrating data storage solutions.
- Creating back-end components.
- Identifying and fixing bottlenecks and bugs.
- Integrating user-facing elements designed by the front-end team.
- Connecting applications with additional web servers.
- Maintaining APIs.
Ruby on Rails Developer Requirements:
- Bachelor’s degree in computer science, computer engineering, or related field.
- Experience working with Ruby on Rails as well as libraries like Resque and RSpec.
- Ability to write clean Ruby code.
- Proficiency with code versioning tools including Git, GitHub, SVN, and Mercurial.
- Experience with AngularJS or BackboneJS.
- Familiarity with MVC, Mocking, RESTful, and ORM.
- Good understanding of front-end technologies including HTML5, JavaScript, and CSS3.
- Knowledge of server-side templating languages including Slim and Liquid.
- Familiarity with testing tools.

Job Description:
- Bookkeeping and accounting in Tally ERP, Xero, QuickBooks, and applicable accounting software
- Responsible for preparation and management of books of accounts, records, and documents for foreign entities
- Preparation and reporting of monthly/periodic MIS.
- Managing billing, receivables, and collection.
- Liaising with foreign consultants with respect to bookkeeping and compliance
- Ensuring compliance under various laws for both payroll and non-payroll matters.
- Managing Audits of the offshore entities under different statutes (GST/Sales Tax, Companies House)
- Managing payroll and payroll compliances
- Managing Banking operations and payments and operational fund flow/cash flow.
Desired Candidate Profile:
- Must have good communication skills to deal with foreign clients.
- Should have good knowledge of MS Office and Tally.
- Experience in corporate reporting, MIS, Power BI, Tableau, etc.

• The Professional Services Implementation Engineer is a customer-facing role responsible for the implementation of Acqueon products.
• The successful candidate is enthusiastic and can easily communicate at all levels from business users to technical engineers.
• You will primarily work remotely, but there are opportunities to work onsite too.
• You will be a consultant and product SME as you interface with end users to assess their current processes and gather their requirements.
• You will use what you have learned to implement, configure, and test the Acqueon software in the customer’s environment.
• Understand omnichannel communication technologies and their role in the contact center
• Good listening and comprehension skills.
• A natural ability to dig in and resolve technical issues in a structured manner
• Have a commitment to excellence in taking care of our customers and expect the same from others
• Experience with deploying products in cloud platforms (AWS, Azure, etc.)
• Experience working with one or more Contact Center software suites such as Cisco UCCE, Amazon Connect, Nice InContact, Twilio, Avaya, or Genesys
• Previous experience with outbound contact center products is a plus

JD for IoT Data Engineer:
The role requires experience in Azure core technologies – IoT Hub/ Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, SQL data warehouse.
You Have:
- Minimum 2 years of software development experience
- Minimum 2 years of experience in IoT/streaming data pipelines solution development
- Bachelor's and/or Master’s degree in computer science
- Strong Consulting skills in data management including data governance, data quality, security, data integration, processing, and provisioning
- Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
- Translated complex analytical requirements into the technical design including data models, ETLs, and Dashboards / Reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large scale IOT data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile
- Experience in handling telemetry data with Spark Streaming, Kafka, Flink, Scala, PySpark, and Spark SQL (a brief sketch follows this list).
- Hands-on experience with containers and Docker
- Exposure to streaming protocols like MQTT and AMQP
- Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
- Strong knowledge of continuous integration, static code analysis, and test-driven development
- Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
- Must have excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
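
To make the telemetry item above concrete, the following is a minimal, hedged sketch of a PySpark Structured Streaming job that parses device telemetry from Kafka; the broker address, topic name, payload schema, and output paths are assumptions for illustration, and the Kafka source connector is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

# Hypothetical telemetry payload schema.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("temperature", DoubleType()),
    StructField("ts", StringType()),
])

# Read the telemetry topic from Kafka (broker/topic names are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "device-telemetry")
       .load())

# Kafka values are bytes; cast to string and parse the JSON into typed columns.
telemetry = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(from_json(col("json"), schema).alias("t"))
             .select("t.*"))

# Land parsed records as Parquet with a checkpoint (paths are illustrative).
query = (telemetry.writeStream
         .format("parquet")
         .option("path", "/mnt/datalake/telemetry/")
         .option("checkpointLocation", "/mnt/datalake/_checkpoints/telemetry/")
         .start())

query.awaitTermination()
```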
Roles & Responsibilities
You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks (a brief sketch follows this list)
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW / Synapse, or SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
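
As a minimal, hedged sketch of the ADF/Databricks integration interface mentioned above: a Databricks-style batch step that reads a raw extract from ADLS Gen2 and publishes it as a Delta table. The storage account, container, file layout, and table name are placeholders, and the Delta format assumes a Databricks (or Delta-enabled Spark) runtime.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import current_timestamp

spark = SparkSession.builder.appName("adf-ingest-interface").getOrCreate()

# Source path on ADLS Gen2 (account/container names are illustrative).
source_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2024/"

# Read the raw extract delivered by the upstream ADF copy activity.
raw = spark.read.option("header", True).csv(source_path)

# Add a simple load-audit column before publishing.
curated = raw.withColumn("ingested_at", current_timestamp())

# Publish as a Delta table for downstream consumption (schema/table name are placeholders).
(curated.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("curated.sales_daily"))
```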

