
Job Summary:
Join our team as an Entry-Level Data Entry Clerk/Typing Specialist, working remotely to support the efficiency and success of our operations. As a crucial member of our team, you will play a key role in maintaining accurate, organized records, ensuring seamless data management, and supporting the organization's overall productivity.
Key Responsibilities:
• Accurate and timely data entry into the company systems.
• Typing and proofreading various documents to maintain high-quality standards.
• Organizing and categorizing data for easy retrieval.
• Collaborating with team members to ensure data accuracy and consistency.
• Adhering to company policies and procedures for data management.
Required Skills:
• Proficient typing skills with a minimum typing speed of [XX words per minute].
• Attention to detail and a commitment to data accuracy.
• Strong organizational skills to manage and prioritize tasks effectively.
• Excellent communication skills for collaboration within a remote team.
• Familiarity with data entry software and systems.
Qualifications:
• High school diploma or equivalent.
• Previous experience in data entry or typing roles is a plus but not mandatory.
• Ability to work independently in a remote setting.

About Concorde Inc
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science, or Data Engineering, plus system design and practical experience building and deploying agentic Generative AI systems. The ideal candidate is passionate about solving complex problems with LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities :
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
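To give a flavour of what "orchestrating tool-augmented AI agents" involves, here is a minimal, framework-free sketch in plain Python. The names (`stub_llm`, `calculator`, `run_agent`) are invented for illustration; a production system would route the decision step through an actual model via LangChain or LangGraph rather than a keyword stub.

```python
# Minimal sketch of a tool-augmented agent loop (no framework).
# The "LLM" is a stub that picks a tool by inspecting the query;
# a real agent would delegate this routing to a language model.

def calculator(expression: str) -> str:
    """A toy tool: evaluate a simple arithmetic expression."""
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}

def stub_llm(query: str) -> tuple[str, str]:
    """Stand-in for an LLM call: choose an action and its input."""
    if any(ch.isdigit() for ch in query):
        return "calculator", query
    return "final", "I can only do arithmetic."

def run_agent(query: str) -> str:
    """Dispatch the chosen action: call a tool or return a final answer."""
    action, payload = stub_llm(query)
    if action in TOOLS:
        return TOOLS[action](payload)
    return payload

print(run_agent("2 + 3 * 4"))  # 14
```

The same pattern (decide, call tool, observe, answer) is what LangGraph formalizes as a graph of nodes with state passed between them.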
Required Skills & Qualifications :
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure and deployment practices (any major provider).
- Data Engineering or Analytics expertise preferred, e.g. Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
About the Role
We are actively seeking talented Senior Python Developers to join our ambitious team dedicated to pushing the frontiers of AI technology. This opportunity is tailored for professionals who thrive on developing innovative solutions and who aspire to be at the forefront of AI advancements. You will work with different companies in the US who are looking to develop both commercial and research AI solutions.
Required Skills:
- Write effective Python code to tackle complex issues
- Use business sense and analytical abilities to glean valuable insights from public databases
- Clearly express the reasoning and logic when writing code in Jupyter notebooks or other suitable mediums
- Extensive experience working with Python
- Proficiency with the language's syntax and conventions
- Previous experience tackling algorithmic problems
- Prior Software Quality Assurance and test-planning experience is nice to have
- Excellent spoken and written English communication skills
The ideal candidate should be able to:
- Clearly explain their strategies for problem-solving.
- Design practical solutions in code.
- Develop test cases to validate their solutions.
- Debug and refine their solutions for improvement.
CTC Budget: 35-55 LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad, focused on maximizing product value through rapid incremental innovation, with extensive SaaS M&A experience including 20+ closed transactions on both the buy and sell sides. The company has over 100 employees and is looking to grow the team.
- 6+ years of experience as a Python developer.
- Experience in web development using Python and Django Framework.
- Experience in Data Analysis and Data Science using Pandas, NumPy, and scikit-learn (good to have).
- Experience in developing User Interface using HTML, JavaScript, CSS.
- Experience with server-side templating languages such as Jinja2 and Mako.
- Knowledge of Kafka and RabbitMQ (good to have).
- Experience with Docker, Git, and AWS.
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Databases: MySQL, PostgreSQL, and general SQL.
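The requirement to "integrate multiple data sources into a single system" can be illustrated with a minimal pandas sketch. The tables and column names (`user_id`, `name`, `amount`) are invented for the example.

```python
# Hedged sketch: combine two data sources into one table with pandas.
import pandas as pd

users = pd.DataFrame({"user_id": [1, 2], "name": ["Asha", "Ravi"]})
orders = pd.DataFrame({"user_id": [1, 1, 2], "amount": [250, 100, 75]})

# Join the sources on the shared key, then aggregate per user.
merged = users.merge(orders, on="user_id", how="left")
totals = merged.groupby("name", as_index=False)["amount"].sum()
print(totals)
```

In a real system the frames would come from different backends (e.g. a MySQL query and an API export), but the join-then-aggregate pattern is the same.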
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
Who We Are
At e6data (https://e6data.io) we’re building core algorithms that make querying 25-200X more performant (i.e. faster and/or cheaper) than every other analytics platform in the world. This unlocks billions of dollars of value across faster analytics, savings on compute/infra, and new use cases that become possible.
Our core team has 100+ years of combined experience and 10+ patents across diverse but complementary disciplines like real-time databases, time-series databases, SQL OLAP engines, distributed computing platforms, stream processing, and log analytics. Two of the founding team are second-time founders with past exits.
We are passionate about solving the deepest and most important problems in computer science and believe in making our technology available to customers regardless of their location, size, or budget. Our vision is for e6data to power every analytical workload in the world - regardless of deployment scenario (cloud, on-premise, edge), across every use case (end-user queries, ETL / data pipelines), and every latency requirement (batch, streaming / real-time).
What You Will Do
- Write awesome code
- Your primary focus will be on building our product around our core algorithms. You will collaborate with our Performance Engineering and DevOps teams.
- Identify tactical and long-term product improvements (code, scripts, etc.) so that e6data development supports frictionless onboarding of customers
- Build our internal Data Platform using e6data
- Add scalable functionality to the core e6data engine for new features like UDFs, SQL functions, etc.
What We Are Looking For
- 4+ years of experience working with Java
- Knowledge of SQL query planning is good to have
- A passion for continuous learning and growth (both at a personal and team level)
- Strong team player
- Experience working in a startup (or enthusiasm for it) is an added advantage
Job Description :
We are currently building a next-generation consumer tech product at the intersection of AI/ML, Mobile, and Commerce. The product requires building proprietary frameworks using the latest technologies.
We have an urgent requirement for a Backend Developer (Fresher/Intern) who would become a core member of our Product Team: a great problem solver who can learn quickly and communicate clearly.
Must Have Skill Set:
- Efficient software development background, including design patterns, data structures, and test-driven development
- Individual contributor with strong analytical and problem-solving abilities; excellent verbal and written communication skills
- Able to work under tight deadlines and stretch as and when required
- Self-starter with the ability to research solutions independently and solve problems
- Strong object-oriented design and coding skills (C++/Java on a Linux platform), preferably with internship experience at product-based companies
- Strong knowledge of Spring Boot and REST API frameworks
- Strong knowledge of data structures and algorithms
- Hands-on expertise in databases/datastores: MongoDB, MySQL, Elasticsearch; Redis and Kafka will be a big plus
Key Responsibilities:
- Complete responsibility for application design, code development, testing, deployment, and maintenance
- Develop code as part of an agile team to deliver new features and enhancements
- Design and develop REST APIs
- Follow approved life-cycle methodologies, create design documents, and perform program coding and testing
- Develop plans outlining steps and timetables for developing programs, and communicate plans and status to management
- Work independently or with minimal supervision, using knowledge, experience, and judgment to accomplish well-defined goals
- Be at ease maintaining cloud instances on AWS and other cloud services
- Identify, debug, and resolve complex production issues
- Take end-to-end ownership of various applications and microservices
- Architect, design, develop, deploy, and operate services that serve millions of users
What You Get:
1. Work with a team of passionate folks who love design and technology
2. Immense learning and growth opportunities in small, multi-functional, tightly knit teams
3. Flexible work timings; 5 days a week
4. Accelerated learning
5. Significant responsibility early in your career, which will come in handy in future
6. Ownership and independence
Qualifications :
B.Tech/MCA in Computer Science or equivalent
About QuHu
QuHu is an audio-first content platform where users can create, share, and distribute their content, with discovery happening via a feed based on an interest graph. We aim to build an audio ecosystem with a global presence consisting of audio-based social networks and audio communities.
At QuHu, we want to cultivate a vibrant and growing online UGC audio community and interactive audio entertainment platform where users are encouraged to create, share, discover and enjoy audio, and experience immersive and diversified entertainment features through audio. QuHu envisions a global audio ecosystem – a place where everyone can be connected through voices and across cultures.
The two pillars around which QuHu is built are:
Users - for whom we want to solve the problem of discovery based on interest (an algorithmically driven feed) and provide an impressive experience through audio-bites (short content), live audio, audio rooms, and audio games.
Creators - for whom we want to make creation and editing a seamless process through a suite of 'creator tools' (sound mixing, trim, pitch shifter, voicemojis, voice filters, noise reduction, etc.) and help them monetise.
The main roles and responsibilities of a Power BI developer are discussed below:
- Power BI development and administration.
- Building Analysis Services reporting models.
- Developing visual reports, dashboards and KPI scorecards using Power BI desktop.
- Connecting to data sources, importing data and transforming data for Business Intelligence.
- Excellent analytical thinking for translating data into informative visuals and reports.
- Able to implement row-level security on data, with an understanding of application security layer models in Power BI.







