
Job Title - Data Entry Operator
Job Description -
1. Data Input and Management - Responsible for accurately inputting, updating, and maintaining data across multiple brands.
2. Quality Assurance - Perform regular checks to ensure data integrity and accuracy. This includes reviewing data for errors or inconsistencies, correcting any mistakes, and verifying the output.
3. Compliance and Confidentiality - Adhere to and comply with data integrity and security policies, maintaining confidentiality of information at all times.
4. Technical Proficiency - Proficient in using computer systems, databases, and software relevant to the company's data management needs.
Qualifications & Experience:
· Any Graduate/Undergraduate
· Minimum 4–8 years of experience.

About Burgundy Hospitality Pvt Ltd

Job Responsibilities:
● Design, test, and build scalable backend Python services.
● Collaborate closely with marketing and product teams to build innovative, robust, and easy-to-use features.
● Develop high-quality code based on detailed designs that meet the product requirements.
● Troubleshoot, test, and maintain the core product software and databases to ensure strong optimization and functionality.
Required Skills:
● Degree in Computer Science, Software Engineering or equivalent.
● At least 3 years of experience in software development.
● Expertise in Python 3.7, Django 2.2+, and REST APIs.
● Willingness to learn and ability to flourish in a dynamic, high-growth, entrepreneurial environment
● Hands-on, self-starter, capable of working independently
● True love for technology and what you do
● Maniacal attention to detail
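As a rough illustration of the "backend Python services and REST APIs" this role centers on, here is a minimal JSON endpoint sketch using only the standard library. The resource name (`/items`) and data are made up for illustration; the role's real services would be built on Django 2.2+ rather than `http.server`.

```python
# Minimal REST-style JSON endpoint using only the standard library.
# The "/items" resource and its data are illustrative placeholders.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

ITEMS = [{"id": 1, "name": "widget"}]

class ItemsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/items":
            body = json.dumps(ITEMS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep request logging quiet
        pass

def serve_in_thread():
    # Port 0 asks the OS for any free port.
    server = HTTPServer(("127.0.0.1", 0), ItemsHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve_in_thread()
    port = server.server_address[1]
    with urlopen(f"http://127.0.0.1:{port}/items") as resp:
        print(json.loads(resp.read()))  # [{'id': 1, 'name': 'widget'}]
    server.shutdown()
```

A framework like Django adds routing, ORM-backed models, and serialization on top of this same request/response pattern.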

About the Role:
We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Lead Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Lead and manage the data engineering team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
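The extract-transform-load pattern described in the first bullet can be sketched in a few lines. This is a minimal illustration, not a production pipeline: SQLite stands in for a real warehouse such as Snowflake or Redshift, and the source rows, table name, and field names are all hypothetical.

```python
# Tiny ETL sketch: extract rows from a source, standardize them,
# and load them into a warehouse table. SQLite stands in for a real
# warehouse; all names and sample data are illustrative.
import sqlite3

def extract():
    # In practice this would read from an API, file, or source database.
    return [
        {"user": " Alice ", "amount": "10.50"},
        {"user": "BOB", "amount": "3"},
    ]

def transform(rows):
    # Cleanse and standardize: trim/normalize names, cast amounts.
    return [
        {"user": r["user"].strip().title(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS purchases (user TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO purchases (user, amount) VALUES (:user, :amount)", rows
    )
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute("SELECT user, amount FROM purchases").fetchall())
    # [('Alice', 10.5), ('Bob', 3.0)]
```

Real pipelines add the concerns the other bullets name: validation, scheduling, monitoring, and incremental loads.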
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
Role & Responsibilities:
- Strong written/verbal communication skills
- Minimum 5-12 years of core Java programming experience with the Collections Framework, concurrent programming, and multithreading (good knowledge of ExecutorService, ForkJoinPool, and other threading concepts)
- Good knowledge of the JVM with an understanding of performance and memory optimization.
- Extensive and expert programming experience in the Java programming language (strong OO skills preferred).
- Excellent knowledge of collections such as ArrayList, Vector, LinkedList, HashMap, Hashtable, and HashSet is mandatory.
- Exercised exemplary development practices including design specification, coding standards, unit testing, and code-reviews.
- Expert level understanding of Object Oriented Concepts and Data Structures
- Good database experience (Sybase, Oracle, or SQL Server), including indexing (clustered, non-clustered), hashing, segmenting, data types such as CLOB/BLOB, views (including materialized views), replication, constraints, functions, triggers, stored procedures, etc.
Desired Competencies (Technical/Behavioral)
- Strong knowledge of Splunk architecture, components, and deployment models (standalone, distributed, or clustered)
- Hands-on experience with Splunk forwarders, search processing, and index clustering
- Proficiency in writing SPL (Search Processing Language) queries and creating dashboards
- Familiarity with Linux/Unix systems and basic scripting (e.g., Bash, Python)
- Understanding of networking concepts and protocols (TCP/IP, syslog)
Key Responsibilities
- Deploy Splunk Enterprise or Splunk Cloud on servers or virtual environments.
- Configure indexing and search head clusters for data collection and search functionalities.
- Deploy universal or heavy forwarders to collect data from various sources and send it to the Splunk environment
- Configure data inputs (e.g., syslog, SNMP, file monitoring) and outputs (e.g., storage, dashboards)
- Identify and onboard data sources such as logs, metrics, and events.
- Use regular expressions or predefined methods to extract fields from raw data
- Configure props.conf and transforms.conf for data parsing and enrichment.
- Create and manage indexes to organize and control data storage.
- Configure roles and users with appropriate permissions using role-based access control (RBAC).
- Integrate Splunk with external authentication systems like LDAP, SAML, or Active Directory
- Monitor user activities and changes to the Splunk environment
- Optimize Splunk for better search performance and resource utilization
- Regularly monitor the status of indexers, search heads, and forwarders
- Configure backups for configurations and indexed data
- Diagnose and resolve issues like data ingestion failures, search slowness, or system errors.
- Install and manage apps and add-ons from Splunkbase or custom-built solutions.
- Create Python scripts for automation and advanced data processing.
- Integrate Splunk with ITSM tools (e.g., ServiceNow), monitoring tools, or CI/CD pipelines.
- Use Splunk's REST API for automation and custom integrations
- Splunk Core Certified Admin certification is good to have
Splunk Development and Administration
- Build and optimize complex SPL (Search Processing Language) queries for dashboards, reports, and alerts.
- Develop and manage Splunk apps and add-ons, including custom Python scripts for data ingestion and enrichment.
- Onboard and validate data sources in Splunk, ensuring proper parsing, indexing, and field extractions.
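For illustration, the props.conf/transforms.conf pairing mentioned above, used for a search-time field extraction, might look like the following minimal sketch. The sourcetype `my_app_logs` and the `user` field are hypothetical placeholders, not part of any real deployment.

```ini
# props.conf -- bind a search-time field extraction to a sourcetype
# (the sourcetype and field names here are illustrative)
[my_app_logs]
REPORT-user = extract_user_field

# transforms.conf -- the extraction that the REPORT- stanza refers to
[extract_user_field]
REGEX = user=(\w+)
FORMAT = user::$1
```

With this in place, events of that sourcetype expose a `user` field that SPL queries and dashboards can filter and aggregate on.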
Job Description:
We are seeking a creative and dynamic individual to join our team as a YouTube Content Creator. As a Content Creator, you will be responsible for producing engaging and high-quality video content for our YouTube channel. This role requires a combination of creativity, technical proficiency, and a deep understanding of online video trends.
Role & Responsibilities:
Content Creation:
- Plan, script, shoot, and edit compelling video content for our YouTube channel.
- Develop innovative and entertaining concepts that resonate with our target audience.
- Stay informed about industry trends and competitor content to ensure our channel remains relevant and competitive.
Video Editing:
- Edit videos to enhance visual appeal, storytelling, and overall quality.
- Incorporate graphics, music, and other elements to elevate the production value of each video.
- Ensure all content meets brand guidelines and quality standards.
Audience Engagement:
- Monitor and respond to audience comments, feedback, and trends.
- Collaborate with the marketing team to develop strategies for audience growth and engagement.
- Create content that encourages viewer interaction and participation.
Collaboration:
- Work closely with cross-functional teams, including marketing, sales, and customer support, to align social media efforts with overall business objectives.
- Collaborate with influencers and partners to enhance content reach and diversity.
Analytics and Optimization:
- Conduct keyword research and implement SEO best practices to enhance video discoverability and reach a broader audience.
- Monitor YouTube algorithm changes and adapt strategies to maximize content visibility.
- Analyze video performance metrics and audience insights.
Preferred Candidate Profile:
- Proven experience as a YouTube Content Creator or similar role.
- Proficiency in video editing software (e.g., Adobe Premiere Pro, Final Cut Pro).
- Strong understanding of the YouTube platform, trends, and best practices.
- Creative mindset with the ability to generate innovative and engaging content ideas.
- Excellent communication and interpersonal skills.
- Ability to meet deadlines and work effectively in a fast-paced environment.
- Familiarity with SEO and keyword optimization for YouTube.
About Company:
TalkCharge is a digital payment and marketing platform. We serve 2 million+ users with online recharges, bill payments, gift cards, and a comprehensive listing of discount coupons (affiliates) and deals.
MUST-have skills (hands-on experience, not just classroom training):
- Hands-on experience with Postman/SOAP UI/REST Assured/JUnit frameworks for API testing
- Hands-on experience with Git, JIRA, Jenkins
- Good understanding of REST API methods (GET, PUT, POST, DELETE) and how they work
- Good understanding of HTTP and JSON syntax
- Good understanding of message schemas, RAML, and the message request-response mechanism
- Working experience in Agile methodology.
- Strong written and verbal communication skills (English).
- The candidate should have good communication skills, as he/she will often interact with global teams.
GOOD to have –
- Understanding of STUB/Service virtualization
- Understanding of API testing tools such as REST Client, Anypoint, etc.
- Ability to understand retail banking functions/requirements.
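The MUST-have list above combines REST API testing with stubbing (service virtualization). As a rough, framework-free sketch of that pattern, the script below starts a throwaway stub service and then asserts on status codes and JSON bodies for POST, GET, and DELETE. In practice this role would use Postman, REST Assured, or similar; every endpoint and field name here is made up for illustration.

```python
# API-check sketch: spin up an in-process stub service, then assert on
# status codes and JSON bodies for each REST method. All endpoint and
# field names are illustrative.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import HTTPError

STORE = {}  # id -> record, the stub's in-memory "database"

class StubHandler(BaseHTTPRequestHandler):
    def _reply(self, code, payload=None):
        body = json.dumps(payload).encode() if payload is not None else b""
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        key = self.path.rsplit("/", 1)[-1]
        if key in STORE:
            self._reply(200, STORE[key])
        else:
            self._reply(404, {"error": "not found"})

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        record = json.loads(self.rfile.read(length))
        STORE[record["id"]] = record
        self._reply(201, record)

    def do_DELETE(self):
        key = self.path.rsplit("/", 1)[-1]
        STORE.pop(key, None)
        self._reply(204)

    def log_message(self, *args):  # keep test output quiet
        pass

def start_stub():
    server = HTTPServer(("127.0.0.1", 0), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, f"http://127.0.0.1:{server.server_address[1]}"

def run_checks():
    server, base = start_stub()
    # POST creates a record -> 201
    payload = json.dumps({"id": "7", "name": "Asha"}).encode()
    req = Request(f"{base}/users", data=payload, method="POST")
    assert urlopen(req).status == 201
    # GET retrieves it -> 200 with matching body
    with urlopen(f"{base}/users/7") as resp:
        assert resp.status == 200
        assert json.loads(resp.read())["name"] == "Asha"
    # DELETE removes it -> 204, and a later GET -> 404
    assert urlopen(Request(f"{base}/users/7", method="DELETE")).status == 204
    try:
        urlopen(f"{base}/users/7")
        raise AssertionError("expected 404")
    except HTTPError as err:
        assert err.code == 404
    server.shutdown()

if __name__ == "__main__":
    run_checks()
    print("all checks passed")
```

Tools like Postman or REST Assured wrap this same request-assert loop with collections, environments, and reporting.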
Key Activities
- Primary responsibility for test automation within Agile delivery streams
- Experience with building test automation frameworks and Continuous Integration
- Some amount of manual exploratory testing
- Test preparation, test design, execution, and reporting
- Defect management lifecycle exposure (using Jira), from defect reporting to tracking to closure
- Writing test plans and test completion reports
- Active in Agile meetings when required, e.g., planning and retrospectives

Qualifications (to fulfill the job duties: certifications, years of experience, degree)
• 3-5 years' experience in a highly technical role at a hyper-growth startup or fast-paced company
• Extensive background designing, developing, testing, deploying, maintaining, and improving software
• Demonstrable experience architecting scalable and cost-effective solutions to ensure and support customer growth
• Experience mentoring junior engineers in all aspects of planning, development, and testing
• Demonstrated ability to translate business goals and initiatives into technical requirements
• Strong understanding of system design and architecture
• Interest in engaging with the latest technologies and evaluating strategies to keep the Novo technology stack up to date
• Excellent written and verbal communication skills, with the ability to collaborate effectively with both technical and non-technical teams
• Work across our tech stack, which includes:
  o Node.js and Go for our application code
  o React for our frontend code
  o GraphQL for communication between systems
  o Docker for running our services
  o PostgreSQL for persistent data storage
Nice to Have, but Not Required:
• Experience with the Go programming language
• Experience working in a startup environment
- Immediate joiner preferred
- BA/BS degree in Computer Science or a related technical field, or equivalent work experience; MS degree preferred.
- 9+ years of development experience in Java (using OO Design and Analysis, Design Patterns, etc.)
- Strong hands-on coding experience in Java 8 is required.
- 4+ years’ experience building highly scalable, distributed and reliable Restful Web Services using Spring, Jersey, etc.
- Should have knowledge of cloud applications, preferably on AWS.
- Knowledge of building front-end applications using JavaScript, AngularJS, HTML5, and CSS3.
- Experience with relational databases like MySQL and NoSQL databases like MongoDB, plus Solr/Lucene, will be a plus.
- Experience with Tomcat, JAX-RS, REST, JPA, IntelliJ, Groovy, and Hibernate is a must.
- Experience with build technologies like Gradle, Jenkins will be a plus.
- Experience with Test Driven Development using TestNG/Junit testing frameworks.
- Experience working with version control like GIT.
- 8+ years of experience shipping quality code
- 4+ years' experience with leading frontend frameworks such as Angular (versions 1 through 8) or ReactJS
- Experience with the Ionic Framework and Cordova
- Experience contributing front-end code to a publicly available, consumer-focused web application
- Solid experience with and understanding of HTML5, CSS3, and JavaScript (vanilla JS and popular libraries like Underscore.js and jQuery)
- Experience with REST-based APIs


