About Anvizent - A Product of DW Practice
Key Responsibilities:
- Conduct business analysis and research to identify social media trends, content creators, market competition and potential consumers for the product
- Help in developing processes and procedures to ensure business solutions meet strategic goals
- Collaborate with team members to collect, analyze, and evaluate information from multiple sources and present findings and recommendations to stakeholders
- Analyze and verify requirements that will ensure the product’s consistency, comprehensibility, feasibility, and conformity to industry standards
- Write and document business requirements, functional requirements, and design specifications
- Continuously improve existing business processes and strive to develop new ones to improve efficiency
Technical Skills:
- Able to exercise independent judgment and take action on it
- Excellent analytical, mathematical, and creative problem-solving skills
- Understand the software life cycle and the basics of programming, testing, algorithms, etc.
- Excellent listening, interpersonal, written, and oral communication skills
- Logical and efficient, with keen attention to detail
- Highly self-motivated and directed
- Ability to effectively prioritize and execute tasks while under pressure
- Strong customer service orientation
- Experience working in a team-oriented, collaborative environment
MicroStrategy Admin
- Familiar with the MicroStrategy architecture; MicroStrategy Admin certification preferred
- Familiar with administrative functions, using Object Manager and Command Manager, installing/configuring MSTR in a clustered architecture, and applying patches and hotfixes
- Monitor and manage existing Business Intelligence development/production systems
- MicroStrategy installation, upgrade, and administration on Windows and Linux platforms
- Ability to support and administer a multi-tenant MicroStrategy infrastructure, including server security troubleshooting and general system maintenance
- Analyze application and system logs during troubleshooting and root-cause analysis
- Handle operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration
- Monitor, report on, and investigate solutions to improve report performance
- Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting
- Provide support for the platform, report execution and implementation, the user community, and data investigations
- Identify improvement areas in environment hosting and upgrade processes
- Identify automation opportunities and participate in automation implementations
- Provide on-call support for Business Intelligence issues
- Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features such as Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.
- Familiar with AWS and Linux scripting
- Knowledge of MSTR Mobile
- Knowledge of capacity planning and system scaling needs
Senior Azure Data Developer
at a global provider of Business Process Management services
Desired Competencies:
- Expertise in Azure Data Factory V2
- Expertise in other Azure components such as Data Lake Store, SQL Database, and Databricks
- Must have working knowledge of Spark programming
- Good exposure to data projects covering data design and source-to-target documentation, including defining transformation rules
- Strong knowledge of CI/CD processes
- Experience in building Power BI reports
- Understanding of the different components: pipelines, activities, datasets, and linked services
- Exposure to dynamic configuration of pipelines using datasets and linked services
- Experience in designing, developing, and deploying pipelines to higher environments
- Good knowledge of file formats for flexible usage and of file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.)
- Strong knowledge of SQL
- Must have worked in full life-cycle development, from functional design to deployment
- Working knowledge of Git and SVN
- Good experience establishing connections with heterogeneous sources such as Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, and various databases
- Working knowledge of the different resources available in Azure, such as Storage Account, Synapse, Azure SQL Server, Azure Databricks, and Azure Purview
- Experience with metadata management, data modelling, and related tools (erwin, ER/Studio, or others) preferred
Preferred Qualifications:
- Bachelor's degree in Computer Science or Technology
- Proven success in contributing to a team-oriented environment
- Proven ability to work creatively and analytically in a problem-solving environment
- Excellent communication (written and oral) and interpersonal skills
Qualifications
BE/BTech
Key Responsibilities:
You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts, and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI. The role plays a vital part in delivering a single version of the truth for the client's data and in providing the MI & BI that enable both operational and strategic decision making. You will take responsibility for projects over the entire software lifecycle and work with minimum supervision, covering technical analysis, design, development, and test support, as well as managing delivery to production. The initial project being resourced is the development and implementation of a data warehouse and the associated MI/BI functions.
Principal Activities:
1. Interpret written business requirements documents.
2. Specify (High Level Design and Tech Spec), code, and write automated unit tests for new aspects of the MI/BI Service.
3. Write clear and concise supporting documentation for deliverable items.
4. Become a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
Location – Bangalore
Job description:
- Design, develop and maintain complex Tableau reports for scalability, manageability, extensibility, performance, and re-use
- Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team
- Write complex SQL queries to automate dashboards
- Implement tools and strategies to translate raw data into valuable business insights
- Identify patterns and trends in data sets
- Work alongside teams to establish business needs
- Provide recommendations to update current MIS to improve reporting efficiency and consistency
Requirement / Desired Skills:
- Solid experience in dashboarding and reporting; industry experience is a plus
- Knowledge of Excel and SQL; expertise with business intelligence tools
- Ability to analyse and interpret data
- Problem-solving skills
- A methodical and logical approach
- Accuracy and attention to detail
- Willingness to learn and adapt to new technologies
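As a rough illustration of the "complex SQL queries to automate dashboards" duty above, here is a minimal sketch using Python's built-in sqlite3 module. The table, columns, and data are invented purely for the example; a real dashboard would query the production database on a schedule.

```python
import sqlite3

# Hypothetical schema and data, chosen only to illustrate the kind of
# aggregation query a scheduled dashboard refresh might run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "2024-01", 120.0), ("North", "2024-02", 80.0),
     ("South", "2024-01", 200.0), ("South", "2024-02", 150.0)],
)

# Aggregate revenue per region, highest first -- the shape of result a
# dashboard tile or MIS report would consume directly.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('South', 350.0), ('North', 200.0)]
```

In practice the same query text would be parameterized and executed by the reporting tool's scheduler rather than by hand.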
Key Responsibilities:
- Development of proprietary processes and procedures designed to process various data streams around critical databases in the organization
- Manage technical resources around data technologies, including relational databases, NoSQL databases, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools
- Creation of a project plan including timelines and critical milestones to success in support of the project
- Identification of the vital skill sets/staff required to complete the project
- Identification of crucial sources of the data needed to achieve the objective.
Skill Requirement :
- Experience with data pipeline processes and tools
- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
- Experience with an existing ETL tool, e.g., Informatica or Ab Initio
- Deep understanding of big data systems like Hadoop, Spark, YARN, Hive, Ranger, Ambari
- Deep knowledge of the Qlik ecosystem, including QlikView, Qlik Sense, and NPrinting
- Proficiency in Python or a similar programming language
- Exposure to data science and machine learning
- Comfort working in a fast-paced environment
Soft attributes :
- Independence: Must be able to work without constant direction or supervision; self-motivated, with a strong work ethic and a willingness to continually put forth extra effort
- Creativity: Must be able to generate imaginative, innovative solutions that meet the needs of the organization. Must be a strategic thinker/solution seller, able to conceive integrated solutions (with field force apps, customer apps, CCT solutions, etc.) and to approach each unique situation or challenge in different ways using the same tools
- Resilience: Must remain effective in high-pressure situations, using both positive and negative outcomes as incentives to move forward toward fulfilling commitments and achieving personal and team goals
This position is for Big Data Engineer/Lead specialized in Hadoop, Spark and AWS Data Engineering technologies with 3 to 12 years of experience.
Roles & Responsibilities
For this role, we require someone with a strong product design sense. The position involves working on complex technical projects and collaborating closely with peers in an innovative and fast-paced environment.
- Grow our analytics capabilities with faster, more reliable data pipelines, and better tools, handling petabytes of data every day.
- Brainstorm and create new platforms, and migrate existing ones to AWS, helping make data available to cluster users in all shapes and forms, with low latency and horizontal scalability.
- Make changes to our data platform, refactoring/redesigning as needed and diagnosing any problems across the entire technical stack.
- Design and develop a real-time events pipeline for data ingestion and real-time dashboarding.
- Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- Design and implement new components using various emerging technologies in AWS and the Hadoop ecosystem, and ensure the successful execution of various projects.
- Optimize and improve existing features or data processes for performance and stability.
- Conduct peer design and code reviews.
- Write unit tests and support continuous integration.
- Be obsessive about quality and ensure minimal production downtime.
- Mentor peers, share information and knowledge and help build a great team.
- Monitor job performance, file system/disk space, cluster and database connectivity, and log files; manage backups and security; and troubleshoot various user issues.
- Collaborate with various cross-functional teams: infrastructure, network, database.
Must have skills: Python, AWS, Scala, Spark, Hadoop, Big Data Analytics
Desired Skills
- Fluent with data structures, algorithms and design patterns.
- Strong hands-on experience with Hadoop, MapReduce, Hive, Spark.
- Excellent programming/debugging skills in Java/Scala.
- Experience with any scripting language such as Python, Bash etc.
- Good to have: experience working with NoSQL databases such as HBase and Cassandra.
- Experience with BI tools such as AWS QuickSight, dashboarding, and metrics.
- Hands-on programming experience with multithreaded applications.
- Good to have: experience with databases, SQL, and messaging queues such as Kafka.
- Good to have: experience developing streaming applications, e.g., Spark Streaming, Flink, Storm, etc.
- Good to have: experience with AWS and cloud technologies such as S3.
- Experience with caching architectures such as Redis and Memcached.
- Experience with memory optimization and GC tuning.
- Experience with profiling and performance optimization.
- Experience with agile development methodologies and DevOps practices.
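As a rough illustration of the micro-batch model behind streaming frameworks such as Spark Streaming mentioned in the skills above, here is a toy sketch in plain Python. The event shape and batch contents are hypothetical; a real job would use the framework's APIs over a source like Kafka rather than in-memory lists.

```python
from collections import defaultdict

def process_batches(batches):
    """Consume micro-batches of events and maintain running counts per
    event type -- the basic shape of a streaming aggregation job."""
    counts = defaultdict(int)
    for batch in batches:          # each batch = one micro-batch of events
        for event in batch:
            counts[event["type"]] += 1
    return dict(counts)

# Hypothetical micro-batches, as a streaming source might deliver them.
batches = [
    [{"type": "click"}, {"type": "view"}],
    [{"type": "click"}, {"type": "click"}],
]
print(process_batches(batches))  # {'click': 3, 'view': 1}
```

In a real pipeline the counts would be checkpointed between batches and emitted to a dashboard or sink instead of printed.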
SKILLS:
Mandatory Skills:
- Advanced understanding of Adobe Analytics as an Architect or, at a minimum, as a Business Practitioner.
- Possess excellent analytical skills to understand the intricacies of the data and visualization.
- Advanced MS Excel skills.
- Good communication skills.
- Team building and management
Desired Skills:
- Hands-on experience with visualization tools such as Tableau and Power BI.
- Understanding of Adobe DTM and Launch (tag management systems).
- Basic SQL.
What you’ll be responsible for:
- Monitor site performance daily.
- Understand user behaviour and create actionable dashboards.
- Create and maintain daily/weekly/monthly reports.
- Analyse and Audit data to generate dashboards.
- Measure and report performance of marketing campaigns, gain insight and assess against goals.
- Hands-on knowledge of user behaviour tracking systems and methodologies.
- Identify key needs or gaps and provide leadership to close them.
Challenges (also responsible) you’ll be facing in the role:
- Understand client requirements and create ad hoc reports/recommendations.
- Provide meaningful insights from data to make operational and strategic decisions for clients.
- Strong understanding of the various technologies used in the digital transformation space.
- Should possess Data Visualization skills to help build interactive reports.
- Independent and proactive self-starter.
We are looking for a Business Intelligence (BI)/Data Analyst to create and manage Power BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis. If you are self-directed, passionate about data, and have business acumen and problem-solving aptitude, we'd like to meet you. Ultimately, you will enhance our business intelligence system to help us make better decisions.
Requirements and Qualifications
- BSc/BA in Computer Science, Engineering, or relevant field.
- Financial experience and a marketing background are a plus
- Strong Power BI development skills, including migration of existing deliverables to Power BI
- Ability to work autonomously
- Data modelling, calculations, conversions, and scheduling data refreshes in Power BI
- Proven experience as a Power BI Developer is a must
- Industry experience is preferred; familiarity with other BI tools (Tableau, QlikView)
- An analytical mind with a problem-solving aptitude
Responsibilities
- Design, develop and maintain business intelligence solutions
- Craft and execute queries upon request for data
- Present information through reports and visualization based on requirements gathered from stakeholders
- Interact with the team to gain an understanding of the business environment, technical context, and organizational strategic direction
- Design, build and deploy new, and extend existing dashboards and reports that synthesize distributed data sources
- Ensure data accuracy, performance, usability, and functionality requirements of BI platform
- Manage data through MS Excel, Google Sheets, and SQL applications as required, and support other analytics platforms
- Develop and execute database queries and conduct analyses
- Develop and update technical documentation requirements
- Communicate insights to both technical and non-technical audiences.
Business Intelligence Developer
at Symansys Technologies India Pvt Ltd
Experience: 6-9 yrs
Location: Noida
Job Description:
- Must have 3-4 years of experience in SSIS and MySQL
- Good experience in Tableau
- Experience in SQL Server
- 1+ years of experience in Tableau
- Knowledge of an ETL tool
- Knowledge of data warehousing