11+ Data extraction Jobs in Ahmedabad | Data extraction Job openings in Ahmedabad
Role - MIS Executive
Experience - 2 to 4 years
Job Location - Ahmedabad
- Maintain existing management information systems.
- Generate and distribute management reports in an accurate and timely manner.
- Use Advanced Excel capabilities, including pivot tables, look-ups, complex formulas and graphing to streamline business processes.
- Ability to understand and analyze complex data and build reports and dashboards.
- Extract the data from the designated portal and update it.
- Provide recommendations to update current MIS to improve reporting efficiency and consistency.
- Perform data analysis for generating reports on a periodic basis.
- Provide strong reporting and analytical information support to the management team.
- Generate both periodic and ad hoc reports as required.
- Analyze business information to identify process improvements for increasing business efficiency and effectiveness.
- Provide support and assistance to management in issue troubleshooting and resolution.
- Handle database management using Advanced Excel tools and MS Access.
- Should be proficient with advanced Excel formulas and features such as Pivot Tables, Lookups, Index, and Conditional Formatting.
- Experience in Tableau, dashboard creation, and data crunching and extraction (a pandas pivot sketch follows this list).
- Qualification: Bachelor's degree with 2 to 4 years of experience.
- 2 to 4 years of experience in MIS and dashboarding is a must.
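For illustration only, a pandas pivot roughly equivalent to the Excel pivot-table reporting described above might look like the sketch below; the workbook name and columns (region, month, sales) are hypothetical placeholders, not part of the role description.

```python
# Illustrative pandas equivalent of an Excel pivot-table MIS report.
# The workbook name and columns (region, month, sales) are hypothetical.
import pandas as pd

df = pd.read_excel("mis_data.xlsx")      # hypothetical source workbook
report = pd.pivot_table(
    df,
    index="region",                      # pivot rows
    columns="month",                     # pivot columns
    values="sales",                      # values to aggregate
    aggfunc="sum",
    margins=True,                        # grand totals, as in Excel
)
report.to_excel("mis_report.xlsx")       # distribute as a periodic report
```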
Position: Lead Python Developer
Location: Ahmedabad, Gujarat
The client company includes a team of experienced information services professionals who are passionate about growing and enhancing the value of information services businesses. They provide support with the talent, technology, tools, infrastructure, and expertise required to deliver across the data ecosystem.
Position Summary
We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures.
Job Responsibilities
● Design, develop, and maintain backend systems and services using Python and TypeScript.
● Develop and maintain web scraping solutions to extract, process, and manage large-scale data from multiple sources.
● Work with relational and non-relational databases, ensuring high availability, scalability, and performance.
● Implement authentication, authorization, and security best practices across services.
● Write clean, maintainable, and testable code following best practices and coding standards.
● Collaborate with frontend engineers, data engineers, and DevOps teams to deliver robust solutions and troubleshoot, debug, and upgrade existing applications.
● Stay updated with backend development trends, tools, and frameworks to continuously improve processes.
● Utilize core crawling experience to design efficient strategies for scraping data from different websites and applications (a minimal scraping sketch follows this list).
● Collaborate with technology and data collection teams to build end-to-end technology-enabled ecosystems and partner on research projects to analyze massive data inputs.
● Design and develop web crawlers, independently solving problems encountered during development.
● Stay updated with the latest web scraping techniques, tools, and industry trends to continuously improve the scraping processes.
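As a rough illustration of the scraping work described above, here is a minimal Python sketch using requests and BeautifulSoup; the URL and CSS selectors are hypothetical, and dynamic, JavaScript-heavy sites would instead call for a browser-automation tool such as Selenium or Puppeteer.

```python
# Minimal scraping sketch for a static HTML listing page.
# The URL and CSS selectors are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

def scrape_listings(url: str) -> list[dict]:
    response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    records = []
    for item in soup.select("div.listing"):          # hypothetical selector
        records.append({
            "title": item.select_one("h2").get_text(strip=True),
            "link": item.select_one("a")["href"],
        })
    return records

if __name__ == "__main__":
    print(scrape_listings("https://example.com/listings"))
```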
Job Requirements
● 4+ years of professional experience in backend development with TypeScript and Python.
● Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS, Express) and Python frameworks (e.g., FastAPI, Django, Flask).
● Experience with tools and libraries for web scraping (e.g., Scrapy, BeautifulSoup, Selenium, Puppeteer)
● Hands-on experience with Temporal for creating and orchestrating workflows
● Proven hands-on experience in web scraping, including crawling, data extraction, deduplication, and handling dynamic websites.
● Proficient in implementing proxy solutions and handling bot-detection challenges (e.g., Cloudflare).
● Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure).
● Proficiency with database systems such as MongoDB and Elasticsearch.
● Hands-on experience with designing and maintaining scalable APIs (a minimal FastAPI sketch follows this list).
● Knowledge of software testing practices (unit, integration, end-to-end).
● Familiarity with CI/CD pipelines and version control systems (Git).
● Strong problem-solving skills, attention to detail, and ability to work in agile environments.
● Great communication skills and the ability to navigate ambiguous situations.
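By way of example, a minimal FastAPI endpoint of the kind such a backend might expose could look like the sketch below; the Record model and the in-memory store standing in for MongoDB/Elasticsearch are hypothetical.

```python
# Minimal FastAPI sketch of a data-serving endpoint.
# The Record model and in-memory store are hypothetical stand-ins.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Record(BaseModel):
    id: str
    source: str
    payload: dict

FAKE_DB: dict[str, Record] = {}   # stand-in for MongoDB/Elasticsearch

@app.get("/records/{record_id}", response_model=Record)
def get_record(record_id: str) -> Record:
    record = FAKE_DB.get(record_id)
    if record is None:
        raise HTTPException(status_code=404, detail="Record not found")
    return record
```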
Job Exposure:
● Opportunity to apply creative methods to acquire and filter North American government and agency data from various websites and sources
● In-depth industry exposure to data harvesting techniques to build and scale robust, sustainable models using open-source applications
● Effective collaboration with the IT team to design tailor-made solutions based on client requirements
● Unique opportunity to research various agencies, vendors, products, and technology tools to compose a solution
Data Engineer
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting: Python and PySpark (see the sketch after this list)
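As a sketch of the file-based ingestion pattern implied by these requirements, the following PySpark snippet reads raw CSV files from S3 and writes partitioned Parquet to a data-lake path; the bucket, paths, and dataset name are hypothetical, and a real AWS Glue job would typically wrap similar logic around a GlueContext.

```python
# Sketch of a file-based ingestion step with PySpark.
# S3 paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_to_datalake").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://my-bucket/landing/orders/")        # hypothetical landing zone
)

curated = raw.withColumn("ingest_date", F.current_date())

(
    curated.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://my-bucket/curated/orders/")    # Parquet data-lake layer
)
```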
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from different sources that expose data through various technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Process and transform data using technologies such as Spark and cloud services; understand the relevant business logic and implement it in the language supported by the base data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify calculation results (see the data-quality sketch after this list)
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior team members and bring in industry best practices
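For instance, an automated data-quality check of the kind mentioned above might, in PySpark, look like the following sketch; the dataset path and columns (order_id, amount) are hypothetical.

```python
# Illustrative data-quality check on a curated dataset.
# The path and columns (order_id, amount) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("s3://my-bucket/curated/orders/")

null_ids = df.filter(F.col("order_id").isNull()).count()
negative_amounts = df.filter(F.col("amount") < 0).count()
duplicates = df.count() - df.dropDuplicates(["order_id"]).count()

if any([null_ids, negative_amounts, duplicates]):
    raise ValueError(
        f"DQ failure: {null_ids} null ids, {negative_amounts} negative amounts, "
        f"{duplicates} duplicate order ids"
    )
```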
QUALIFICATIONS
- 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Job Summary:
As an Angular Developer, you will be responsible for developing and maintaining web applications that are both visually appealing and highly functional across various devices. You will collaborate with cross-functional teams to deliver responsive and scalable solutions.
Key Responsibilities:
- Develop and maintain web applications using Angular framework.
- Implement responsive design techniques to ensure optimal user experiences across different devices.
- Collaborate with UX/UI designers to create intuitive and visually appealing interfaces.
- Optimize application performance and ensure high quality and responsiveness.
- Write clean, maintainable, and reusable code following best practices.
- Implement unit tests and participate in code reviews.
- Stay updated with the latest industry trends and technologies to ensure continuous improvement.
- Troubleshoot and debug issues in a timely manner.
- Participate in Agile/Scrum development processes.
Required Qualifications:
About Company:
Our client is one of the strongest consumer brands in the bakery category, with a 25,000 sq ft state-of-the-art centralized manufacturing facility with European equipment near Ahmedabad, Gujarat. The founding team consists of a ‘Master Baker’ from Le Cordon Bleu, Paris, one of the finest culinary institutes in the world, and an IIM-A alumnus with a McKinsey background.
Position Overview:
The Executive/Assistant Manager - Accounts will play a crucial role in managing and overseeing the financial operations of the company. This position involves maintaining accurate financial records, preparing financial statements, coordinating audits, and providing support to the finance team.
Responsibilities:
1. Data Entry – RTV, Revenue Assurance, Expense Analysis
2. General Ledger Management: Maintain the general ledger, recording transactions, reconciling accounts, and ensuring proper classification of financial data. Ensuring accuracy, compliance with accounting principles and standards. Expense Analysis. RTV approvals. Revenue Assurance coordination with Inventory & Ops team.
3. Accounts Payable and Receivable: Oversee accounts payable and receivable processes, ensuring timely and accurate processing of invoices, payments, and collections.
4. Tax Compliance: Collaborate with internal and external stakeholders to ensure accurate and timely filing of various taxes and returns such as GST, TDS, TCS, income tax, and corporate tax. Ensuring proper documentation and paper trail. Handling submissions against various department notices.
5. Audit Coordination: Prepare documentation and support the external audit process, addressing auditor inquiries and implementing audit recommendations.
6. Team Support: Provide guidance and training to junior staff members, fostering their professional growth within the finance department.
Qualifications:
- Bachelor's degree in Accounting, Finance, or a related field; Inter CA (or CA with multiple attempts) qualification preferred.
- 2-3 years of relevant experience in accounting or finance roles, with demonstrated progression in responsibilities.
- Proficient in financial software and tools, such as Excel, accounting software (e.g., QuickBooks, SAP), and ERP systems.
- Strong knowledge of accounting principles, financial reporting, and taxation.
- Excellent analytical skills and attention to detail.
- Effective communication and interpersonal skills, with the ability to collaborate across departments.
- Problem-solving mindset and ability to work under pressure in a fast-paced environment.
- 3 - 5 years of progressive IT experience with a minimum of 1+ years in building Ruby on Rails Web applications and RESTful API endpoints
- Proficient with scripting languages, specifically Python and Bash
- Experience in object-oriented JavaScript.
- Must have Linux command line experience.
- Good experience in developing and integrating reliable REST web services.
- Possess good communication, organizational and creative thinking skills.
- Multi-tasking individual who likes to work both in a team environment and individually
- Excellent technical abilities, leadership, decision-making, and adaptability to new technology
- Good knowledge of Agile processes and practices
- Team player
Experience - Minimum 2+ years
CTC - Up to 8.00 LPA (depending on experience)
Job Description:
1. Execution of trade orders on behalf of clients.
2. Building relationships with clients & educating them about Investments.
3. Client Acquisition as per targets
Required Criteria:
1. Graduation Mandate.
2. NISM 8 certification (Equity & derivatives) / NISM 5 certification (Mutual Funds).
3. Stock Market knowledge is mandatory.
4. Should possess good communication skills.
5. Dealing and cross-sales experience is a must.
NOTE - Candidates having excellent knowledge of Equity, order punching on terminals, Futures & Options,
Job Description/ Responsibilities:
PHP/MySQL development and support for diverse web-based applications using MVC.
Knowledge of web development such as e-commerce, CMS, custom application development, blogs, social networking, and custom PHP.
Open-source frameworks (CodeIgniter, Laravel, …)
Fruxinfo Private Limited
Corporate office: D 608, Titanium City Centre, Nr. 100 Feet Anand Nagar Rd, opp. Seema Hall, Ahmedabad
https://www.fruxinfo.com/
Job Location for Developer: Bhavana Estate, Narol
Looking for a technical recruiter for a US Staffing Division.
Requires an excellent command of the English language. Should have experience working with both IT and non-IT requirements, and be able to source candidates as per the given job description.
PRAXINFO Hiring DevOps Engineer.
Position : DevOps Engineer
Job Location : C.G.Road, Ahmedabad
EXP : 1-3 Years
Salary : 40K - 50K
Required skills:
⦿ Good understanding of cloud infrastructure (AWS, GCP etc)
⦿ Hands on with Docker, Kubernetes or ECS
⦿ Ideally strong Linux background (RHCSA , RHCE)
⦿ Good understanding of monitoring systems (Nagios etc) and logging solutions (Elasticsearch etc)
⦿ Microservice architectures
⦿ Experience with distributed systems and highly scalable systems
⦿ Demonstrated history in automating operations processes via services and tools ( Puppet, Ansible etc)
⦿ Systematic problem-solving approach coupled with a strong sense of ownership and drive.
If anyone is interested, then share your resume at hiring@praxinfo.com!




