
Job Summary:
Join our team as an Entry Level Data Entry Clerk/Typing specialist, working remotely to contribute to the efficiency and success of our operations. As a crucial member of our team, you will play a key role in maintaining accurate and organized records, ensuring seamless data management, and supporting the overall productivity of the organization.
Key Responsibilities:
• Accurate and timely data entry into the company systems.
• Typing and proofreading various documents to maintain high-quality standards.
• Organizing and categorizing data for easy retrieval.
• Collaborating with team members to ensure data accuracy and consistency.
• Adhering to company policies and procedures for data management.
Required Skills:
• Proficient typing skills with a minimum typing speed of [XX words per minute].
• Attention to detail and a commitment to data accuracy.
• Strong organizational skills to manage and prioritize tasks effectively.
• Excellent communication skills for collaboration within a remote team.
• Familiarity with data entry software and systems.
Qualifications:
• High school diploma or equivalent.
• Previous experience in data entry or typing roles is a plus but not mandatory.
• Ability to work independently in a remote setting.

About Concorde Inc
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, system design, and practical experience in building and deploying Agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities:
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks, as sketched after this list.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
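As a rough illustration of the event-driven responsibility above, here is a minimal producer sketch, assuming the kafka-python client, a local broker, and a hypothetical `agent-events` topic (the topic name and payload are illustrative, not the platform's actual schema):
```python
# Minimal Kafka producer sketch (kafka-python); broker address, topic, and payload are illustrative.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                 # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode(),  # JSON-encode payloads
)

# Publish a hypothetical agent-lifecycle event for downstream consumers.
producer.send("agent-events", {"event": "agent_run_completed", "run_id": "123"})
producer.flush()  # block until the message is actually delivered
```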
Required Skills & Qualifications:
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django); a minimal FastAPI sketch follows this list.
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure (any major provider) and deployment practices.
- Data Engineering or Analytics expertise is preferred, e.g. Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
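A minimal sketch of the web-framework layer in that stack, assuming FastAPI and a placeholder `run_agent` function standing in for a LangChain/LangGraph-orchestrated agent (the endpoint path, request/response models, and the agent call are illustrative assumptions, not an actual platform API):
```python
# Minimal FastAPI service wrapping a hypothetical GenAI agent.
# `run_agent` is a placeholder for a LangChain/LangGraph agent invocation.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class AgentRequest(BaseModel):
    query: str                      # user prompt forwarded to the agent
    session_id: str | None = None   # optional conversation/session handle

class AgentResponse(BaseModel):
    answer: str

def run_agent(query: str, session_id: str | None = None) -> str:
    """Placeholder: in a real service this would call the orchestrated agent."""
    return f"echo: {query}"

@app.post("/agent/invoke", response_model=AgentResponse)
async def invoke_agent(req: AgentRequest) -> AgentResponse:
    # Delegate to the agent layer; keep the HTTP layer thin and testable.
    answer = run_agent(req.query, req.session_id)
    return AgentResponse(answer=answer)
```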
About the Role
We are actively seeking talented Senior Python Developers to join our ambitious team dedicated to pushing the frontiers of AI technology. This opportunity is tailored for professionals who thrive on developing innovative solutions and who aspire to be at the forefront of AI advancements. You will work with different companies in the US who are looking to develop both commercial and research AI solutions.
Required Skills:
- Write effective Python code to tackle complex issues
- Use business sense and analytical abilities to glean valuable insights from public databases
- Clearly express the reasoning and logic when writing code in Jupyter notebooks or other suitable mediums
- Extensive experience working with Python
- Proficiency with the language's syntax and conventions
- Previous experience tackling algorithmic problems
- Nice to have some prior Software Quality Assurance and Test Planning experience
- Excellent spoken and written English communication skills
The ideal candidates should be able to:
- Clearly explain their strategies for problem-solving.
- Design practical solutions in code.
- Develop test cases to validate their solutions (see the sketch after this list).
- Debug and refine their solutions for improvement.
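A minimal illustration of that workflow, solving a hypothetical two-sum style problem and validating it with unit tests (the problem and all names are chosen only for illustration):
```python
# Illustrative problem: return indices of two numbers that sum to a target.
# The solution plus its tests show the "solve, then validate" workflow.
import unittest

def two_sum(nums: list[int], target: int) -> tuple[int, int] | None:
    seen: dict[int, int] = {}                 # value -> index of earlier element
    for i, value in enumerate(nums):
        complement = target - value
        if complement in seen:
            return seen[complement], i        # found a valid pair
        seen[value] = i
    return None                               # no pair sums to target

class TwoSumTests(unittest.TestCase):
    def test_basic_pair(self):
        self.assertEqual(two_sum([2, 7, 11, 15], 9), (0, 1))

    def test_no_solution(self):
        self.assertIsNone(two_sum([1, 2, 3], 100))

    def test_duplicates(self):
        self.assertEqual(two_sum([3, 3], 6), (0, 1))

if __name__ == "__main__":
    unittest.main()
```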
CTC Budget: 35-55LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad, focused on maximizing product value while delivering rapid incremental innovation. The company has extensive SaaS M&A experience, including 20+ closed transactions on both the buy and sell sides, has over 100 employees, and is looking to grow the team.
- 6 plus years of experience as a Python developer.
- Experience in web development using Python and Django Framework.
- Experience in Data Analysis and Data Science using Pandas, NumPy and scikit-learn (good to have).
- Experience in developing User Interface using HTML, JavaScript, CSS.
- Experience in server-side templating languages including Jinja2 and Mako.
- Knowledge of Kafka and RabbitMQ (good to have).
- Experience with Docker, Git and AWS.
- Ability to integrate multiple data sources into a single system (see the sketch after this list).
- Ability to collaborate on projects and work independently when required.
- Databases: MySQL, PostgreSQL, SQL.
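As a small illustration of integrating multiple data sources (per the requirement above), a pandas sketch that joins two made-up extracts and aggregates them; all names and values are invented:
```python
# Merge two hypothetical data sources into one frame and aggregate with pandas.
import pandas as pd

# Hypothetical extracts from two different sources (e.g. a CRM dump and an orders table).
customers = pd.DataFrame({"customer_id": [1, 2, 3], "name": ["Asha", "Ravi", "Meena"]})
orders = pd.DataFrame({"order_id": [10, 11, 12], "customer_id": [1, 1, 3], "amount": [250.0, 100.0, 75.0]})

# Left-join orders onto customers so customers without orders are kept.
merged = customers.merge(orders, on="customer_id", how="left")

# Total spend per customer; customers with no orders count as zero.
spend = (
    merged.fillna({"amount": 0.0})
    .groupby(["customer_id", "name"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "total_spend"})
)
print(spend)
```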
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
The Objective:
You will play a crucial role in designing, implementing, and maintaining our data infrastructure, running tests, and updating the systems.
Job function and requirements:
- Expert in Python, Pandas and NumPy, with knowledge of Python web frameworks such as Django and Flask.
- Able to integrate multiple data sources and databases into one system.
- Basic understanding of frontend technologies like HTML, CSS, JavaScript.
- Able to build data pipelines (a small sketch follows this list).
- Strong unit test and debugging skills.
- Understanding of the fundamental design principles behind a scalable application.
- Good understanding of RDBMS databases such as MySQL or PostgreSQL.
- Able to analyze and transform raw data.
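A compact sketch in that spirit: a tiny extract-transform-load pipeline using pandas and SQLite, with invented source records, column names, and table name:
```python
# Tiny extract-transform-load sketch: raw records -> cleaned frame -> SQLite table.
import sqlite3
import pandas as pd

# Extract: raw records as they might arrive from an upstream source.
raw = pd.DataFrame({
    "email": [" A@EXAMPLE.COM ", "b@example.com", None],
    "signup_date": ["2024-01-05", "2024-02-10", "2024-03-01"],
})

# Transform: normalise emails, parse dates, drop rows without an email.
clean = (
    raw.dropna(subset=["email"])
    .assign(
        email=lambda df: df["email"].str.strip().str.lower(),
        signup_date=lambda df: pd.to_datetime(df["signup_date"]),
    )
)

# Load: write the cleaned frame into a local SQLite database.
with sqlite3.connect("pipeline.db") as conn:
    clean.to_sql("contacts", conn, if_exists="replace", index=False)
    print(pd.read_sql("SELECT * FROM contacts", conn))
```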
About us
Mitibase helps companies find the most relevant warm prospects every month and then helps their team act on those with automation. We do so by automatically tracking key accounts and contacts for job changes and relationship triggers, and surfacing them as warm leads in your sales pipeline.
Who We are?
At e6data (https://e6data.io) we’re building core algorithms that make querying 25 - 200X more performant (i.e. faster and/or cheaper) than every other analytics platform in the world. This results in billions of dollars of value unlocked across faster analytics, savings on computing/infra, and new use cases becoming possible.
Our core team has 100+ years of combined experience and 10+ patents across diverse but complementary disciplines like real-time databases, time-series databases, SQL OLAP engines, distributed computing platforms, stream processing, and log analytics. Two of the founding team are second-time founders with past exits.
We are passionate about solving the deepest and most important problems in computer science and believe in making our technology available to customers regardless of their location, size, or budget. Our vision is for e6data to power every analytical workload in the world - regardless of deployment scenario (cloud, on-premise, edge), across every use case (end-user queries, ETL / data pipelines), and every latency requirement (batch, streaming / real-time).
What you will do?
- Write awesome code
- Your primary focus will be on building our product around our core algorithms. You will collaborate with our Performance Engineering and DevOps teams.
- Identify tactical and long-term product improvements (code, scripts, etc.) so that e6data development supports frictionless onboarding of customers
- Build our internal Data Platform using e6data
- Adding scalable functionality to the core e6data engine for new features like UDFs, SQL functions, etc
What we are looking for?
- Worked with Java for a minimum of 4 years
- Knowledge of SQL Query planning will also be good to have
- A passion for continuous learning and growth (both at a personal and team level)
- Strong team player
- Experience working in a startup (or enjoying that environment) will be an added advantage
Responsibilities
- Build and mentor the platform team at Checko.
- Own the design, development, testing, deployment, and craftsmanship of the team’s infrastructure and systems capable of handling massive amounts of requests with high reliability and scalability
- Leverage deep and broad technical expertise to mentor engineers and provide leadership on resolving complex technology issues
- Entrepreneurial and out-of-box thinking essential for a technology startup
- Guide the team in unit-testing code for robustness, including edge cases, usability, and general reliability
Requirements
- Must have experience in the design, development, testing, and deployment of systems capable of handling massive amounts of requests with high reliability and scalability
- Must have a strong command of writing production-level code in Java or Python, including skills in debugging, performance analysis/optimization, and memory usage optimization
- Must have worked with real-time web/mobile applications and event-driven architectures
- Must have experience working with relational and non-relational databases and understanding their data models and performance tradeoffs.
- Must have solid engineering principles and a clear understanding of data structures and algorithms
- Should have knowledge of service-oriented architecture, caching techniques, micro-services, and distributed systems
- Should have a basic understanding of C++/ReactJS/Angular/Node
Desired Skills and Experience
Algorithms, debugging, performance optimization on low-end processors, data structures, REST, service-oriented architecture.
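As a small illustration of the caching techniques listed above, an in-process memoisation sketch using only the standard library; `fetch_profile` and its latency are hypothetical, and a production service would more likely use a shared cache such as Redis:
```python
# In-process caching sketch: memoise an expensive lookup with functools.lru_cache.
# `fetch_profile` stands in for a slow database or downstream-service call.
import time
from functools import lru_cache

@lru_cache(maxsize=1024)          # keep up to 1024 most recently used results
def fetch_profile(user_id: int) -> dict:
    time.sleep(0.1)               # simulate a slow backend call
    return {"user_id": user_id, "name": f"user-{user_id}"}

start = time.perf_counter()
fetch_profile(42)                 # miss: pays the simulated latency
fetch_profile(42)                 # hit: served from the in-process cache
print(f"two calls took {time.perf_counter() - start:.2f}s (second call was cached)")
print(fetch_profile.cache_info())  # hits=1, misses=1
```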
The main roles and responsibilities of a Power BI developer are discussed below:
- Power BI development and administration.
- Building Analysis Services reporting models.
- Developing visual reports, dashboards and KPI scorecards using Power BI desktop.
- Connecting to data sources, importing data and transforming data for Business Intelligence.
- Excellent analytical thinking for translating data into informative visuals and reports.
- Able to implement row-level security on data and have an understanding of application security layer models in Power BI (https://powerbi.microsoft.com/en-us/).








