Business Intelligence Developer
Symansys Technologies India Pvt Ltd

6 - 9 yrs
₹15L - ₹23L / yr
Remote, Noida
Skills
SSIS
Tableau
SQL Server Integration Services (SSIS)
Business Intelligence (BI)
ETL
Data Warehousing
We are hiring a BI Developer.
Experience: 6-9 yrs
Location: Noida

Job Description:
  • 3-4 years of experience in SSIS and MySQL
  • 1+ year of experience in Tableau
  • Experience in SQL Server
  • Knowledge of an ETL tool
  • Knowledge of data warehousing
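To picture the kind of work this role involves, here is a minimal extract-transform-load sketch in plain Python; the source rows, column names, and helper functions are illustrative only, not from any Symansys system:

```python
# A toy ETL pipeline: pull raw rows, clean them, append to a "warehouse".

def extract(rows):
    """Simulate pulling raw records from a source (e.g. a MySQL staging table)."""
    return list(rows)

def transform(rows):
    """Normalise one field and drop records that fail a basic quality check."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # skip incomplete rows
        cleaned.append({"region": row["region"].strip().upper(),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows, warehouse):
    """Append cleaned rows to the warehouse table (here just a list)."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = [{"region": " north ", "amount": "120.5"},
       {"region": "south", "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
# One row survives the quality check and lands in the warehouse.
```

In a real SSIS package the same three stages map to source components, transformations, and destinations; the sketch only shows the shape of the flow.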


Similar jobs

Amazech Systems pvt Ltd
Remote only
5 - 7 yrs
₹8L - ₹13L / yr
ADF
Azure Synapse
SSIS
SQL
ETL
+11 more

Hiring for Azure Data Engineers.

Location: Bangalore

Employment type: Full-time, permanent

website: www.amazech.com

 

Qualifications: 

B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, Electrical, or Electronics Engineering, with a good academic background.


Experience and Required Skill Sets:


•       Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer

•       Experience with data warehouse/analytical systems using Azure Synapse.

•       Proficient in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, Synapse, etc.

•       Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview.

•       Good technical knowledge in Microsoft SQL Server BI Suite (ETL, Reporting, Analytics, Dashboards) using SSIS, SSAS, SSRS, Power BI

•       Design and develop batch and real-time streaming data loads to data warehouse systems
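One common pattern behind the "copy activity" bullet above is a watermark-driven incremental load: copy only rows changed since the last run, then advance the stored watermark. The sketch below is plain Python under invented data, not Azure Data Factory itself; `incremental_copy` and the row shape are hypothetical.

```python
# Hypothetical watermark-based incremental load: the pattern a Data Factory
# copy activity often implements when moving only new/changed rows downstream.

def incremental_copy(source_rows, last_watermark):
    """Select rows modified after the last successful load and
    return them with the new watermark to persist for the next run."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

# Simulated source table; "modified" stands in for a change-tracking column.
source = [{"id": 1, "modified": 10},
          {"id": 2, "modified": 25},
          {"id": 3, "modified": 30}]
rows, watermark = incremental_copy(source, last_watermark=20)
# Only the rows modified after watermark 20 are copied.
```

The stored watermark is what makes the load restartable: a failed run simply re-reads from the last persisted value.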

 

 Other Requirements:


A Bachelor's or Master's degree (Engineering or computer-related degree preferred)

Strong understanding of Software Development Life Cycles including Agile/Scrum


Responsibilities: 

•       Ability to create complex, enterprise-transforming applications that meet and exceed client expectations. 

•       Ownership of outcomes, strong project management abilities, and the ability to keep the team on schedule.

Softobiz Technologies Private limited
Posted by Adlin Asha
Hyderabad
8 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Data Warehouse (DWH)
Amazon Redshift
PostgreSQL
+2 more

Experience: 8+ Years

Work Location: Hyderabad

Mode of work: Work from Office


Senior Data Engineer / Architect

 

Summary of the Role

 

The Senior Data Engineer / Architect is a key role within the data and technology team, responsible for engineering and building data solutions that enable seamless use of data within the organization.

 

Core Activities

-         Work closely with the business teams and business analysts to understand and document data usage requirements

-         Develop designs relating to data engineering solutions including data pipelines, ETL, data warehouse, data mart and data lake solutions

-         Develop data designs for reporting and other data use requirements

-         Develop data governance solutions covering data security, data quality, data lineage, etc.

-         Lead implementation of data use and data quality solutions

-         Provide operational support for users for the implemented data solutions

-         Support development of solutions that automate reporting and business intelligence requirements

-         Support development of machine learning and AI solutions using large-scale internal and external datasets

 

Other activities

-         Work on and manage technology projects as and when required

-         Provide user and technical training on data solutions

 

Skills and Experience

-         5-8 years of experience in a senior data engineer / architect role

-         Strong experience with AWS based data solutions including AWS Redshift, analytics and data governance solutions

-         Strong experience with industry standard data governance / data quality solutions  

-         Strong experience managing a PostgreSQL data environment

-         Background as a software developer working in AWS / Python will be beneficial

-         Experience with BI tools like Power BI and Tableau

-         Strong written and oral communication skills

 

Latent Bridge Pvt Ltd
Posted by Mansoor Khan
Remote only
3 - 7 yrs
₹5L - ₹20L / yr
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

· Familiar with the MicroStrategy architecture; Admin Certification preferred

· Familiar with administrative functions, using Object Manager, Command Manager, installation/configuration of MSTR in clustered architecture, applying patches, hot-fixes

· Monitor and manage existing Business Intelligence development/production systems

· MicroStrategy installation, upgrade and administration on Windows and Linux platform

· Ability to support and administer multi-tenant MicroStrategy infrastructure including server security troubleshooting and general system maintenance.

· Analyze application and system logs while troubleshooting and root cause analysis

· Work on operations like deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration.

· Monitor, report and investigate solutions to improve report performance.

· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.

· Provide support for the platform, report execution and implementation, user community and data investigations.

· Identify improvement areas in Environment hosting and upgrade processes.

· Identify automation opportunities and participate in automation implementations

· Provide on-call support for Business Intelligence issues

· Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features like Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.

· Familiar with AWS, Linux Scripting

· Knowledge of MSTR Mobile

· Knowledge of capacity planning and system scaling needs

Think n Solutions
Posted by TnS HR
Remote only
1.5 - 6 yrs
Best in industry
Microservices
RESTful APIs
Microsoft SQL Server
SQL Server Integration Services (SSIS)
SQL Server Reporting Services (SSRS)
+10 more

*Apply only if you are currently serving your notice period.


HIRING SQL Developers
with a maximum notice period of 20 days


Job ID: TNS2023DB01

Who Should apply?

  • Only for serious job seekers who are ready to work the night shift
  • Technically strong candidates who are willing to take up challenging roles and advance their careers
  • No DBAs & BI Developers, please

 

Why Think n Solutions Software?

  • Exposure to the latest technology
  • Opportunity to work on different platforms
  • Rapid Career Growth
  • Friendly Knowledge-Sharing Environment

 

Criteria:

  • BE/MTech/MCA/MSc
  • 2 years of hands-on experience in MS SQL / NoSQL
  • Immediate joiners preferred/ Maximum notice period between 15 to 20 days
  • Candidates will be selected based on logical/technical and scenario-based testing
  • Work time - 10:00 pm to 6:00 am

 

Note: Candidates who have attended the interview process with TnS in the last 6 months will not be eligible.

 

 

Job Description:

 

  1. Technical Skills Desired:
    1. Experience in MS SQL Server and one of these relational DBs (PostgreSQL / AWS Aurora / MySQL) or any NoSQL DB (MongoDB / DynamoDB / DocumentDB) in an application development environment, with eagerness to switch
    2. Design database tables, views, indexes
    3. Write functions and procedures for Middle Tier Development Team
    4. Work with any front-end developers in completing the database modules end to end (hands-on experience in the parsing of JSON & XML in Stored Procedures would be an added advantage).
    5. Query Optimization for performance improvement
    6. Design and develop SSIS packages or use other transformation tools for ETL
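The JSON-parsing skill mentioned above (in SQL Server this is typically done with OPENJSON inside a stored procedure) can be pictured with a plain-Python equivalent; the payload and field names below are invented for illustration:

```python
import json

# Flatten a nested JSON order document into rows ready for a relational table,
# the same shaping a stored procedure would do with OPENJSON in T-SQL.
payload = '{"order_id": 42, "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]}'

def flatten_order(raw):
    """Parse the JSON document and emit one flat row per line item."""
    doc = json.loads(raw)
    return [{"order_id": doc["order_id"], "sku": item["sku"], "qty": item["qty"]}
            for item in doc["items"]]

rows = flatten_order(payload)
# Two flat rows, each carrying the parent order_id.
```

The same idea applies to the XML case: shred the hierarchy into parent-keyed rows before insert.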

 

  2. Functional Skills Desired:
    1. Experience in the banking / insurance / retail domain would be an added advantage
    2. Client interaction would be an added advantage

 

  3. Good-to-Have Skills:

  1. Knowledge in a Cloud Platform (AWS / Azure)
  2. Knowledge on version control system (SVN / Git)
  3. Exposure to Quality and Process Management
  4. Knowledge in Agile Methodology

 

  4. Soft skills (additional):
    1. Team building (attitude to train, work alongside, and mentor juniors)
    2. Communication skills (all kinds)
    3. Quality consciousness
    4. Analytical approach to business requirements
    5. Out-of-the-box thinking for business solutions
Srijan Technologies
Posted by PriyaSaini
Remote only
3 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data modeling
Python
PySpark
ETL
+3 more

Role Description:

  • You will be part of the data delivery team and will have the opportunity to develop a deep understanding of the domain/function.
  • You will design and drive the work plan for the optimization/automation and standardization of the processes incorporating best practices to achieve efficiency gains.
  • You will run data engineering pipelines, link raw client data with the data model, conduct data assessments, perform data quality checks, and transform data using ETL tools.
  • You will perform data transformations, modeling, and validation activities, as well as configure applications to the client context. You will also develop scripts to validate, transform, and load raw data using programming languages such as Python and / or PySpark.
  • In this role, you will determine database structural requirements by analyzing client operations, applications, and programming.
  • You will develop cross-site relationships to enhance idea generation, and manage stakeholders.
  • Lastly, you will collaborate with the team to support ongoing business processes by delivering high-quality end products on-time and perform quality checks wherever required.
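As a rough sketch of the data quality checks described above, here is a minimal rule-based validator in plain Python; the rules, row shapes, and function names are hypothetical, not from any Srijan project:

```python
# Apply named predicate rules to a batch of rows; failing rows are collected
# by rule so they can be reported or quarantined before loading.

def run_quality_checks(rows, rules):
    """Return {rule_name: [indexes of rows that fail the rule]}."""
    failures = {}
    for name, predicate in rules.items():
        bad = [i for i, row in enumerate(rows) if not predicate(row)]
        if bad:
            failures[name] = bad
    return failures

rows = [{"id": 1, "age": 34}, {"id": 2, "age": -5}, {"id": None, "age": 28}]
rules = {
    "id_not_null": lambda r: r["id"] is not None,
    "age_non_negative": lambda r: r["age"] >= 0,
}
failures = run_quality_checks(rows, rules)
# Row 2 fails the null check; row 1 fails the non-negative check.
```

In a PySpark pipeline the same rules would be expressed as DataFrame filters so they scale to large datasets.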

Job Requirement:

  • Bachelor’s degree in Engineering or Computer Science; Master’s degree is a plus
  • 3+ years of professional work experience with a reputed analytics firm
  • Expertise in handling large amounts of data through Python or PySpark
  • Ability to conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools
  • Experience deploying ETL / data pipelines and workflows on cloud platforms such as Azure and Amazon Web Services will be valued
  • Comfort with data modelling principles (e.g. database structure, entity relationships, UID etc.) and software development principles (e.g. modularization, testing, refactoring, etc.)
  • A thoughtful and comfortable communicator (verbal and written) with the ability to facilitate discussions and conduct training
  • Strong problem-solving, requirements-gathering, and leadership skills
  • Track record of completing projects successfully on time, within budget and as per scope

Pinghala
Posted by Ashwini Dhaipule
Pune
3 - 5 yrs
₹6L - ₹10L / yr
Power BI
Data Visualization
Data architecture
Informatica PowerCenter
SQL
+5 more

Pingahla is recruiting Business Intelligence Consultants/Senior Consultants who can help us with Information Management projects (domestic, onshore and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter/Talend DI/Informatica Cloud and must be proficient with Business Intelligence in general. The job is based out of our Pune office.

Responsibilities:

  • Manage the customer relationship by serving as the single point of contact before, during and after engagements.
  • Architect data management solutions.
  • Provide technical leadership to other consultants and/or customer/partner resources.
  • Design, develop, test and deploy data integration solutions in accordance with customer’s schedule.
  • Supervise and mentor all intermediate and junior level team members.
  • Provide regular reports to communicate status both internally and externally.
Qualifications:

A typical profile that would suit this position has the following background:

  • A graduate from a reputed engineering college
  • Excellent IQ and analytical skills, with the ability to grasp new concepts and learn new technologies
  • A willingness to work with a small team in a fast-growing environment
  • A good knowledge of Business Intelligence concepts

 

Mandatory Requirements:

  • Knowledge of Business Intelligence
  • Good knowledge of at least one of the following data integration tools - Informatica Powercenter, Talend DI, Informatica Cloud
  • Knowledge of SQL
  • Excellent English and communication skills
  • Intelligent, quick to learn new technologies
  • Track record of accomplishment and effectiveness in handling customers and managing complex data management needs
     

 

Symansys Technologies India Pvt Ltd
Tanu Chauhan
Posted by Tanu Chauhan
Pune, Mumbai
2 - 8 yrs
₹5L - ₹15L / yr
Data Science
Machine Learning (ML)
Python
Tableau
R Programming
+1 more

Specialism: Advanced Analytics, Data Science, regression, forecasting, analytics, SQL, R, Python, decision trees, random forests, SAS, clustering, classification

Senior Analytics Consultant- Responsibilities

  • Understand the business problem and requirements by building domain knowledge, and translate it into a data science problem
  • Conceptualize and design cutting-edge data science solutions to solve the problem, applying design-thinking concepts
  • Identify the right algorithms, tech stack, and sample outputs required to efficiently address the end need
  • Prototype and experiment with the solution to successfully demonstrate its value
  • Independently, or with support from the team, execute the conceptualized solution as per plan, following project management guidelines
  • Present results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights, and visualization
  • Help build overall data science capability for eClerx through support in pilots, pre-sales pitches, product development, and practice development initiatives
netmedscom
Posted by Vijay Hemnath
Chennai
2 - 5 yrs
₹6L - ₹25L / yr
Big Data
Hadoop
Apache Hive
Scala
Spark
+12 more

We are looking for an outstanding Big Data Engineer with experience setting up and maintaining data warehouses and data lakes for an organization. This role will collaborate closely with the Data Science team and assist them in building and deploying machine learning and deep learning models on big data analytics platforms.

Roles and Responsibilities:

  • Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'Big Data' technologies.
  • Develop programs in Scala and Python as part of data cleaning and processing.
  • Assemble large, complex data sets that meet functional / non-functional business requirements and foster data-driven decision making across the organization.
  • Design and develop distributed, high-volume, high-velocity multi-threaded event processing systems.
  • Implement processes and systems to validate data, monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
  • Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Provide high operational excellence guaranteeing high availability and platform stability.
  • Collaborate closely with the Data Science team and assist them in building and deploying machine learning and deep learning models on big data analytics platforms.

Skills:

  • Experience with Big Data pipeline, Big Data analytics, Data warehousing.
  • Experience with SQL/No-SQL, schema design and dimensional data modeling.
  • Strong understanding of Hadoop architecture and the HDFS ecosystem, and experience with a Big Data technology stack such as HBase, Hadoop, Hive, MapReduce.
  • Experience in designing systems that process structured as well as unstructured data at large scale.
  • Experience in AWS/Spark/Java/Scala/Python development.
  • Strong skills in PySpark (Python & Spark): ability to create, manage, and manipulate Spark DataFrames; expertise in Spark query tuning and performance optimization.
  • Experience in developing efficient software code/frameworks for multiple use cases leveraging Python and big data technologies.
  • Prior exposure to streaming data sources such as Kafka.
  • Knowledge of shell scripting and Python scripting.
  • High proficiency in database skills (e.g., Complex SQL), for data preparation, cleaning, and data wrangling/munging, with the ability to write advanced queries and create stored procedures.
  • Experience with NoSQL databases such as Cassandra / MongoDB.
  • Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission.
  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development).
  • Experience building and deploying applications on on-premises and cloud-based infrastructure.
  • Having a good understanding of machine learning landscape and concepts. 
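The MapReduce model named in the skills above can be illustrated with a toy word count in plain Python; real Hadoop or Spark jobs distribute these same map, shuffle, and reduce phases across a cluster (the function names here are illustrative only):

```python
from collections import defaultdict
from itertools import chain

# Toy MapReduce word count: map emits (word, 1) pairs, the shuffle groups
# pairs by key, and the reduce sums each key's values.

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every line."""
    return chain.from_iterable(((w, 1) for w in line.split()) for line in lines)

def shuffle_reduce(pairs):
    """Shuffle pairs by key, then reduce each key's values by summing."""
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

counts = shuffle_reduce(map_phase(["big data big pipelines", "data lakes"]))
# Each distinct word maps to its total occurrence count.
```

In Spark the equivalent is a `flatMap` followed by `reduceByKey`; the framework handles the shuffle and the parallelism.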

 

Qualifications and Experience:

Engineering graduates and postgraduates, preferably in Computer Science from premier institutions, with 3-5 years of proven work experience as a Big Data Engineer or in a similar role.

Certifications:

Good to have at least one of the Certifications listed here:

    AZ 900 - Azure Fundamentals

    DP 200, DP 201, DP 203, AZ 204 - Data Engineering

    AZ 400 - DevOps Certification

Product based Company
Agency job
via Crewmates by Gowtham V
Coimbatore
4 - 15 yrs
₹5L - ₹25L / yr
ETL
Big Data
Hi Professionals,
We are looking for an ETL Developer for a reputed client in Coimbatore (permanent role).
Work Location: Coimbatore
Experience: 4+ years
Skills:
  • Strong experience in Talend or any of the ETL tools (Informatica/DataStage/Talend)
  • DB preference: Teradata / Oracle / SQL Server
  • Supporting tools: JIRA / SVN
Notice Period: Immediate to 30 days
LatentView Analytics
Posted by Kannikanti madhuri
Chennai
5 - 8 yrs
₹5L - ₹8L / yr
Data Science
Analytics
Data Analytics
Data modeling
Data mining
+7 more
Job Overview:

We are looking for an experienced Data Science professional to join our Product team, lead the data analytics team, and manage the processes and people responsible for accurate data collection, processing, modelling and analysis. The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our clients. The incumbent will work closely with leaders across product, sales, and marketing to support and implement high-quality, data-driven decisions. They will ensure data accuracy and consistent reporting by designing and creating optimal processes and procedures for analytics employees to follow. They will use advanced data modelling, predictive modelling, natural language processing and analytical techniques to interpret key findings.

Responsibilities for Analytics Manager:

  • Build, develop and maintain data models, reporting systems, data automation systems, dashboards and performance metrics that support key business decisions.
  • Design and build technical processes to address business issues.
  • Manage and optimize processes for data intake, validation, mining and engineering, as well as modelling, visualization and communication deliverables.
  • Examine, interpret and report results to stakeholders in leadership, technology, sales, marketing and product teams.
  • Develop and implement quality controls and standards to ensure quality standards are met.
  • Anticipate future demands of initiatives related to people, technology, budget and business within your department and design/implement solutions to meet these needs.
  • Communicate results and business impacts of insight initiatives to stakeholders within and outside of the company.
  • Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.

Qualifications for Analytics Manager:

  • Working knowledge of data mining principles: predictive analytics, mapping, collecting data from multiple cloud-based data sources.
  • Strong SQL skills and the ability to perform effective querying.
  • Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analysing data, drawing conclusions, and developing actionable recommendations for business units.
  • Experience and knowledge of statistical modelling techniques: GLM multiple regression, logistic regression, log-linear regression, variable selection, etc.
  • Experience working with and creating databases and dashboards, using all relevant data to inform decisions.
  • Strong problem-solving, quantitative and analytical abilities.
  • Strong ability to plan and manage numerous processes, people and projects simultaneously.
  • Excellent communication, collaboration and delegation skills.
  • At least 5 years of experience in a position monitoring, managing and drawing insights from data, and at least 3 years of experience leading a team.
  • Proficiency and experience with the following tools/programs:
    • Strong programming skills with querying languages: R, Python, etc.
    • Experience with big data tools like Hadoop.
    • Experience with data visualization tools: Tableau, d3.js, etc.
    • Experience with Excel, Word, and PowerPoint.