11+ Economics Jobs in Hyderabad | Economics Job openings in Hyderabad

Lead the development of advanced quantitative trading models and mentor junior engineers.
Minimum Qualification: Bachelor's or Master's degree in any field, provided the course includes Mathematics or Economics as a subject.
Experience: 4+ years of professional experience in any field, with a strong foundation in Mathematics or Economics.
Locations: London, New York, Mumbai, Hyderabad
How to Apply:
- Log in to tacoi.paromint.com.
- Navigate to your Profile.
- Copy your wallet public address from the app.
- Include your wallet public address in your application email.
- Mention the job code SQE002.
- Attach your resume to the email.
Note: Candidates will be shortlisted for interviews based on their hedging ability on tacoi.paromint.com, especially in options and commodity derivatives.
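For orientation, delta hedging is the most common form of option hedging: hold an offsetting quantity of the underlying equal to the option's delta. Below is a minimal sketch, assuming a Black-Scholes setting for a European call; the framing and all numbers are illustrative assumptions, not part of the posting:

```python
from math import log, sqrt, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def call_delta(spot: float, strike: float, rate: float, vol: float, t: float) -> float:
    """Black-Scholes delta of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    return norm_cdf(d1)

# Illustrative numbers only: hedge a short call by holding `delta` units
# of the underlying, rebalancing as the spot price moves.
delta = call_delta(spot=100.0, strike=105.0, rate=0.05, vol=0.20, t=0.5)
print(f"delta = {delta:.3f}: hold {delta:.3f} units of the underlying per short call")
```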
Jr. Testing Engineer
Are you a convergent thinker and a problem solver who loves finding and fixing bugs? Then SmartDocs has an opportunity for you to fulfill your goals.
We are actively seeking a Jr. Testing Engineer committed to delivering bug-free, high-quality products to clients. The ideal candidate thinks critically and can communicate diplomatically within and outside their areas of responsibility.
Roles & Responsibilities:
- Must be able to write highly efficient manual test cases and perform functional, regression, and other levels of testing based on business requirements.
- Experience with test methodologies, writing test plans, creating test cases and debugging.
- Working closely with the product team in understanding the requirements and workflow.
- Designing, developing and executing automation test scripts.
- Raising defects/bugs and tracking them till closure. Provide support and documentation.
- Collaborate closely with other team members and departments.
- Execute all levels of testing (system, integration, regression, etc.).
- Detect and track software defects and inconsistencies.
- Automate tests using test frameworks, as illustrated in the sketch below.
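To give a flavour of the kind of automated test case this role involves, here is a minimal sketch using pytest; apply_discount is a hypothetical function invented purely for illustration:

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Function under test (hypothetical): apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

@pytest.mark.parametrize("price,percent,expected", [
    (100.0, 10, 90.0),   # typical case
    (59.99, 0, 59.99),   # boundary: no discount
    (80.0, 100, 0.0),    # boundary: full discount
])
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected

def test_apply_discount_rejects_invalid_percent():
    # defect-style check: invalid input must fail loudly, not silently
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```

Run with `pytest`; each parametrized case reports pass/fail separately, which keeps regression runs easy to triage.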
Requirements:
- Bachelor's degree in Computer Science or a related field.
- Minimum 6 months to 1 year of work experience in automation testing.
- For freshers, a certification course in manual and automation testing is mandatory.
- Ability to work in a fast-paced environment with minimal supervision.
- Critical thinking and problem-solving skills.
- Great interpersonal and communication skills.
- Experience in using a defect tracking system to report, track and resolve defects.
- Good understanding of the Agile software development methodology.
- Passion for software quality assurance, problem detection and analysis.
Job Description
Education: Graduation in Computer Science
Around 3 years of professional experience in Oracle Applications 11i/R12 as a Technical Consultant.
Expertise in developing reports using Report Builder, Discoverer, and XML Publisher. Technical knowledge of various modules such as OM, FA, GL, AP, PO, INV, AR, CM, and AOL.
Oracle Database upgrade from 9i to 11g.
Database patching; space management (purging and archiving); connectivity issues; tablespace creation, modification, and deletion; troubleshooting Concurrent Manager; monitoring application performance.
Excellent knowledge of Oracle (SQL, PL/SQL, Reports 10g).
Good exposure to SQL*Loader.
Good knowledge of the technical architecture and functional flow of Oracle Applications modules such as Accounts Payable, Accounts Receivable, Purchasing, General Ledger, Fixed Assets, Inventory, and Order Management.
Good experience in designing and preparing documents such as MD70, MD120, and TE20 as part of the object deliverables.
Worked on inbound interfaces (e.g., AP Invoice, GL Interface, AR Invoice).
Experience in designing and developing conversion programs to load data from legacy systems into Oracle Applications with effective validation logic and error-handling mechanisms (Supplier Conversion, Item Conversion); a simplified sketch of that pattern follows.
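As a rough illustration of the validate-then-load pattern such conversion programs follow, here is a self-contained Python sketch; the field names and sample rows are hypothetical, and a real conversion would insert valid rows into Oracle interface tables (e.g., via SQL*Loader or PL/SQL) rather than a Python list:

```python
import csv
import io

REQUIRED_FIELDS = ("supplier_number", "supplier_name")

def validate(row: dict) -> list:
    """Return a list of validation errors for one legacy record."""
    errors = [f"missing {field}" for field in REQUIRED_FIELDS if not row.get(field)]
    if row.get("supplier_number") and not row["supplier_number"].isdigit():
        errors.append("supplier_number must be numeric")
    return errors

# Hypothetical legacy extract; a real program would read a flat file or DB.
legacy_extract = io.StringIO(
    "supplier_number,supplier_name\n"
    "1001,Acme Ltd\n"
    ",Missing Number Inc\n"
    "10X3,Bad Number Corp\n"
)

staged, rejected = [], []
for row in csv.DictReader(legacy_extract):
    errors = validate(row)
    if errors:
        rejected.append((row, errors))  # error report for correction/reprocessing
    else:
        staged.append(row)              # would be loaded into an interface table

print(f"staged {len(staged)} rows, rejected {len(rejected)} rows")
for row, errors in rejected:
    print(f"rejected {row['supplier_name'] or '<unnamed>'}: {'; '.join(errors)}")
```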

AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps.
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
- 7 years of work experience with ETL, data modelling, and data architecture.
- Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.
- Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions.
- Orchestrate workflows using Airflow.
Technical Experience:
- Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines.
- Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions, and Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, and Athena.
➢ Author ETL processes using Python and PySpark; a minimal Glue job skeleton follows this list.
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ Monitor ETL processes using CloudWatch Events.
➢ You will work in collaboration with other teams; good communication is a must.
➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
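As a rough sketch of what a Glue ETL job in this stack looks like, here is a minimal PySpark job skeleton. The database, table, and S3 path names are hypothetical, and the script runs only inside the AWS Glue runtime, which provides the awsglue modules:

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve arguments and build contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# Transform with plain PySpark, then convert back to a DynamicFrame.
orders_df = orders.toDF().filter("order_total > 0")
cleaned = DynamicFrame.fromDF(orders_df, glue_context, "cleaned")

# Write Parquet back to S3 (hypothetical path); Athena or Redshift
# Spectrum can then query the curated data in place.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```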
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes.
➢ Expert-level skills in writing and optimizing SQL.
➢ Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence
Technology: Node.js, DynamoDB / MongoDB
Roles:
- Design & implement Backend Services.
- Able to redesign the architecture.
- Design and implement applications using MVC and microservice architectures.
- 9+ years of experience developing service-based applications using Node.js.
- Expert-level skills in developing web applications using JavaScript, CSS and HTML5.
- Experience working on teams that practice BDD (Behavior-Driven Development).
- Understanding of micro-service architecture and RESTful API integration patterns.
- Experience using Node.js for automation and leveraging NPM for package management
- Solid object-oriented design experience, including creating and leveraging design patterns.
- Experience working in a DevOps/continuous delivery environment and its associated toolsets (e.g., Jenkins, Puppet)
Desired/Preferred Qualifications :
- Bachelor's degree or equivalent experience
- Strong problem solving and conceptual thinking abilities
- Desire to work in a collaborative, fast-paced, start-up like environment
- Experience leveraging Node.js frameworks such as Express.
- Experience with distributed source control management, e.g., Git
Responsibilities:
- Draft proposal portions such as introduction, overview, project approach, and cover letter
- Draft proposal template to hand off to subject matter expert and writing teams
- Develop and execute consistent company identity through document templates, letterhead, and logo usage
- Lead proposal team on RFP, RFQ, award submittals, and roster application responses
- Synthesize proposal materials into final client-ready package
- Maintain proposal milestone schedules, including issue and review dates, kickoff meetings, and due dates.
- Coordinate proposal sections and handle final formatting, assembly, packaging, and delivery.
- Ensure documents are secure and up to company standards.
- Facilitate the proposal review process.
What you will bring:
- Responsibility for coordinating proposals
- Coordinate and maintain team documentation efforts for responses to RFPs
- Analyze requirements and ensure that proposals meet requirements
- Edit and rewrite proposals, including creating templates and boilerplate text
- Draft proposals and communicate across teams to get input and meet deadlines


- Hands-on expert in UI/front-end engineering and development, specifically React, Redux, and Node.js, who can code, design, and own front-end development and code.
- Experience in React v16, React Redux, React Fiber, and advanced JavaScript (ES6+).
- Experience in advanced state management (e.g., Redux, MobX, RxJS, GraphQL)
- Good experience with UI frameworks such as AngularJS, Vue.js, Ext JS, jQuery, React, or an equivalent SPA framework
- Experience in HTML5, JavaScript, CSS3, Express.js, and GraphQL
- Strong understanding of REST APIs, WebSockets, Service Workers, the HTTP/S protocol, web security, cloud infrastructure, CI/CD, web packaging and optimization, UX, and styling.
- Expertise in advanced JavaScript
- Knowledge of and experience with data structures and design methodologies.
- Problem-solving skills
- Good experience working in Agile/Scrum teams
- Good written and verbal communication skills.


- We need people with TestComplete + Python.
- TestComplete knowledge is good to have, but Python knowledge is mandatory (a brief sketch follows).
- C# is also good to have.
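For orientation, here is a minimal sketch of what a TestComplete test written in Python can look like. It runs only inside TestComplete, which injects globals such as TestedApps, Sys, and Log; the "notepad" tested app and the window names are hypothetical examples:

```python
# Runs inside TestComplete's Python engine: TestedApps, Sys, and Log are
# globals injected by TestComplete, not importable modules, so this will
# not run in a plain interpreter. App and window names are hypothetical.

def test_notepad_typing():
    TestedApps.notepad.Run()  # launch the app configured as a TestedApp
    editor = Sys.Process("notepad").Window("Notepad", "*").Window("Edit")
    editor.Keys("Hello from TestComplete")  # simulate user keystrokes
    if editor.wText == "Hello from TestComplete":
        Log.Message("Editor contains the expected text")
    else:
        Log.Error("Unexpected editor content: " + editor.wText)
```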
About the Role
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team offers many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with passion for and experience in architecting and delivering high-quality distributed systems at massive scale.
Responsibilities & ownership
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment (a toy sketch follows this list)
- Propose and promote strategic company-wide tech investments, weighing business goals, customer requirements, and industry standards
- Lead the team in solving complex, unknown, and ambiguous problems and customer issues that cut across team and module boundaries, applying technical expertise and influencing others
- Review and influence designs of other team members
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Partner with other leaders to nurture innovation and engineering excellence in the team
- Drive priorities with others to facilitate timely accomplishments of business objectives
- Perform RCA of customer issues and drive investments to avoid similar issues in the future
- Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
- Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth
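As a toy illustration of the concurrency and parallelization themes above (not Dremio's actual engine code), here is a sketch that fans partition scans out across a thread pool with concurrent.futures; the partitions and the predicate are invented:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def scan_partition(partition):
    """Stand-in for an I/O-bound partition scan; returns (id, match_count)."""
    pid, rows = partition
    return pid, sum(1 for row in rows if row % 2 == 0)  # toy predicate

# Invented partitions standing in for splits of a distributed dataset.
partitions = [(pid, range(pid * 1000, (pid + 1) * 1000)) for pid in range(8)]

# Fan scans out across a thread pool and gather results as they complete;
# threads suit I/O-bound scans, while CPU-bound work would favor processes.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(scan_partition, p) for p in partitions]
    for future in as_completed(futures):
        pid, count = future.result()
        print(f"partition {pid}: {count} matching rows")
```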
Requirements
- B.S./M.S. or equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 15+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
- 8+ years of experience developing complex and scalable distributed systems and successfully delivering, deploying, and managing microservices
- Subject matter expert in one or more of: query processing or optimization, distributed systems, concurrency, microservice-based architectures, data replication, networking, storage systems
- Experience driving company-wide initiatives, convincing stakeholders, and delivering on them
- Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
- Ability to anticipate and propose plan/design changes based on changing requirements
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Hands-on experience working on projects on AWS, Azure, and GCP
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
- Understanding of distributed file systems such as S3, ADLS or HDFS
- Excellent communication skills and affinity for collaboration and teamwork

