The job you are looking for has expired or has been deleted. Check out similar jobs below.

Similar jobs

BI Developer

Founded 2004
Location: Mumbai
Experience: 3 - 7 years
Salary: 2 - 7 lacs/annum

Responsibilities
1. Design & develop detailed ETL specifications based on the business requirements, using SSIS & Oracle procedures.
2. Apply a good understanding of data warehousing & dimensional modelling concepts.
3. Develop, test and implement enterprise data movement (ETL and CDC) solutions (a minimal sketch follows this listing).
4. Design, develop & test SSRS reports & dashboards built on large volumes of data sourced from multidimensional analysis cubes.
5. Translate business needs into technical solutions by working with business and IT stakeholders.
6. Assist in effort estimation and planning for SSIS & SSRS implementations.
7. Troubleshoot & provide solutions, based on priority, for any issues clients report in ETL loads and reports.
8. Optimize ETL processing time using industry best practices & standards, so that reports are delivered on time.
9. Support the documentation of solution designs, user guides and troubleshooting guides wherever required.
10. Design & develop ETL packages using SSIS.

Required Skills
1. 3+ years of working experience in ETL and data warehousing using MSBI (SSRS, SSIS, Power BI).
2. Extensive experience with SSIS and Power BI for developing management dashboards.
3. Experience on large-scale data warehouse implementation projects using MSBI & SQL Server.
4. Extensive experience in the design and development of SSIS packages and the automation of their processing.
5. Extensive experience in the design and development of SSRS visualizations & reports drawing on multiple sources.
6. Experience with SQL / PL/SQL scripting using SQL Server.
7. Extensive experience optimizing ETL performance to fit the assigned execution windows.
8. Experience handling large volumes of data coming from disparate systems.
9. Working experience in the insurance brokerage domain is a plus.
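Purely as an illustration of the watermark-driven incremental loads this role describes (items 3 and 8 under Responsibilities), here is a minimal T-SQL sketch; every object name (etl.LoadControl, stg.Sales, dw.FactSales) is hypothetical, not taken from the posting.

-- Hypothetical incremental (watermark-based) load step of the kind an
-- SSIS Execute SQL task or an Oracle procedure in this role might wrap.
DECLARE @LastLoaded   DATETIME2,
        @NewWatermark DATETIME2;

-- High-water mark recorded by the previous run, and the newest staged row.
SELECT @LastLoaded = LastLoadedAt
FROM etl.LoadControl
WHERE TableName = 'FactSales';

SELECT @NewWatermark = MAX(ModifiedAt)
FROM stg.Sales;

IF @NewWatermark IS NOT NULL   -- nothing staged: leave the watermark alone
BEGIN
    -- Move only rows that arrived since the last run, which keeps each
    -- execution inside its assigned window.
    INSERT INTO dw.FactSales (SaleId, CustomerKey, Amount, SoldAt)
    SELECT s.SaleId, s.CustomerKey, s.Amount, s.SoldAt
    FROM stg.Sales AS s
    WHERE s.ModifiedAt >  @LastLoaded
      AND s.ModifiedAt <= @NewWatermark;

    -- Advance the watermark for the next run.
    UPDATE etl.LoadControl
    SET LastLoadedAt = @NewWatermark
    WHERE TableName = 'FactSales';
END;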

Job posted by
Jyothi Jose

.NET Developer

Founded 2012
Location: Hyderabad
Experience: 6 - 8 years
Salary: 6 - 13 lacs/annum

1. .NET (C#) – the OMAR web application (portal) is developed in C# .NET and runs on the web server (IIS).
2. SQL Server – loads the data from various systems (Informatica, CTM, SNOW, HDI) and maintains it in the OMAR database.
3. SSIS / INFA – load data from the various source systems and generate the reports (Excel format, CSV format, or auto-email with a URL).
4. Control-M / SQL Agent – Control-M is used to schedule the SSIS/INFA jobs; SQL Agent schedules the SQL jobs (stored procedures).
5. Windows services – used to copy the active-job CSV file from the Control-M shared drive path to the ETL shared drive path, to send OMAR job-failure notifications for some of the jobs when the data in OMAR is not up to date, and to run the OMAR balancing service when the EDW claim load completes. This service runs every 5 minutes and keeps checking for claim-load completion; on completion, the balancing SSIS packages are triggered through the Windows service, which is written in C# .NET (a sketch of this completion check follows the list).
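As an illustration of item 5's completion check, here is a minimal T-SQL sketch of the query such a polling service might issue against the database. The control table (etl.BatchStatus), batch name and Agent job name (OMAR_Balancing) are all hypothetical, not from the posting; msdb.dbo.sp_start_job is SQL Server's documented procedure for starting an Agent job.

-- Hypothetical check the 5-minute polling service could run: once the
-- EDW claim load reports completion, start the balancing job (which
-- would in turn run the balancing SSIS packages).
IF EXISTS (
    SELECT 1
    FROM etl.BatchStatus            -- assumed load-control table
    WHERE BatchName = 'EDW_ClaimLoad'
      AND Status    = 'COMPLETED'
      AND RunDate   = CAST(GETDATE() AS DATE)
)
BEGIN
    EXEC msdb.dbo.sp_start_job @job_name = N'OMAR_Balancing';
END;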

Job posted by
Tejasree Paruchuri

Database Architect

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: 10 - 20 lacs/annum

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization, to drive high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and warehouse that pools the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data-model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong in SQL, with demonstrated RDBMS experience (e.g. Postgres, MongoDB); Unix shell scripting preferred
- Experience with Unix and comfort working in the shell (bash or Korn preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for them
- Cluster maintenance, including creation and removal of nodes, using tools like Ganglia, Nagios and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and plan capacity
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring
- HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Define, develop, document and maintain Hive-based ETL mappings and scripts (a minimal sketch follows this list)
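As an illustration of the last responsibility, here is a minimal sketch in HiveQL (Hive's SQL dialect); the table names, columns, HDFS path and partitioning scheme are all assumptions for the example, not details from the posting.

-- Hypothetical Hive ETL mapping: expose raw delimited files in the data
-- lake as an external table, then rewrite them into a partitioned,
-- columnar (ORC) warehouse table.
CREATE EXTERNAL TABLE IF NOT EXISTS raw_sales (
    store_id STRING,
    sale_id  STRING,
    amount   DOUBLE,
    sold_at  TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/lake/raw/sales';   -- assumed HDFS path

CREATE TABLE IF NOT EXISTS dwh_sales (
    sale_id STRING,
    amount  DOUBLE,
    sold_at TIMESTAMP
)
PARTITIONED BY (store_id STRING)
STORED AS ORC;

-- Dynamic-partition insert: one warehouse partition per store.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE dwh_sales PARTITION (store_id)
SELECT sale_id, amount, sold_at, store_id   -- partition column goes last
FROM raw_sales;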

Job posted by
Rahul Malani

Fiorano Consultants
at THBS

Founded 1998
Location: Bengaluru (Bangalore)
Experience: 4 - 8 years
Salary: 7 - 12 lacs/annum

About the Company: The company focuses on high-end, niche technical skills, predominantly in SOA, integration, cloud, mobile and big data. THBS provides software services to enterprise clients across different industry verticals through a combination of offshore and onsite services. The company has its development centre in Bangalore (India), with sales & operations offices in Bristol (UK), New Jersey (US), Dubai, Spain, Germany, Austria and Paris to reach the local markets. The company is CMMI Level 3 certified and has also been certified to comply with British Security Standard BS 7799 (now ISO 27001).

About the requirement: We are looking for Fiorano consultants for the Bangalore location with 4-8 years of Fiorano ESB experience. The candidate should have an end-to-end understanding of requirements, along with good communication skills.

Job posted by
Lavanya M Rao
Why apply via CutShort?
Connect with actual hiring teams and get a fast response. No third-party recruiters. No spam.