
- Professional experience in Python (mandatory)
- Basic knowledge of any BI tool (Microsoft Power BI, Tableau, etc.) and experience in R will be an added advantage
- Proficient in Excel
- Good verbal and written communication skills
Key Responsibilities:
- Analyze data trends and provide intelligent business insights; monitor operational and business metrics
- Take complete ownership of the business excellence dashboard and prepare reports for senior management stating trends, patterns, and predictions using relevant data
- Review, validate, and analyze data points, and implement new data analysis methodologies
- Perform data profiling to identify and understand anomalies (see the pandas sketch after this list)
- Perform analysis to assess the quality and meaning of data
- Develop policies and procedures for the collection and analysis of data
- Analyze existing processes with the help of data, and propose process changes and/or lead process re-engineering initiatives
- Use BI tools (Microsoft Power BI/Tableau) to develop and manage BI solutions
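As a rough illustration of the data-profiling responsibility above, here is a minimal pandas sketch that summarizes a metrics table and flags anomalous days with a simple z-score rule. The file name and the `daily_orders` column are hypothetical stand-ins for whatever source feeds the dashboard:

```python
import pandas as pd

# Hypothetical daily-metrics extract; in practice this would come from
# the data source behind the business excellence dashboard.
df = pd.read_csv("daily_metrics.csv", parse_dates=["date"])

# Basic profile: column types, missing values, summary statistics.
print(df.dtypes)
print(df.isna().sum())
print(df["daily_orders"].describe())

# Flag anomalous days with a simple z-score rule (|z| > 3).
z = (df["daily_orders"] - df["daily_orders"].mean()) / df["daily_orders"].std()
print(df.loc[z.abs() > 3, ["date", "daily_orders"]])
```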

Similar jobs
- Extensive exposure to at least one Business Intelligence platform (ideally QlikView/Qlik Sense); if not Qlik, then knowledge of an ETL tool, e.g., Informatica/Talend
- At least one data query language: SQL/Python
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, data architecture/schemas, data integrations, data models, and data flows is a must
Company Description
What We Do
Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together global enterprise innovation and start-up innovation. Today we support digital transformation for the largest enterprises on the planet.
By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. Our culture of Relentless Performance has enabled over 99% of Miratech's engagements since our inception in 1989 to succeed by meeting or exceeding scope, schedule, and/or budget objectives.
Job Description
We are looking for a Senior Perl Developer to join our team and help us build solutions and implement technologies that improve the user experience.
Responsibilities:
- Develop and maintain large-scale Perl applications.
- Perform application modifications and enhancements based on business needs.
- Develop clean, high-quality, reusable code based on programming standards.
- Coordinate with the Project Manager to clearly understand business requirements and expectations.
- Stay abreast of the latest trends in application development techniques and technologies.
- Suggest optimal application development solutions to meet or exceed business objectives.
- Develop best practices to ensure coding efficiency and quality.
- Analyze and resolve coding issues in a timely and accurate manner.
- Prepare and maintain coding documentation for reference purposes.
- Prioritize, plan, and handle multiple tasks effectively.
- Complete assigned development tasks within deadlines.
- Report project status to the manager on a regular basis.
Qualifications
- 7+ years of experience in Perl development.
- Experience developing and maintaining shell scripts.
- Strong working knowledge of Perl and Unix-based systems.
- Experience developing database-driven web services/applications against SQL databases such as Oracle or MySQL.
- Detail-focused, with experience reviewing technical documentation, diagrams, and plans to help meet and/or define requirements.
- Able to communicate technical information clearly to both technical and non-technical stakeholders.
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a data scientist to join our growing team. This position will require you to think and act on the company's architecture and data needs, specifically around geospatial data. The position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers, and even colleagues from other business functions. Come save the planet with us!
Your Role
Manage: It goes without saying that you will be handling large volumes of image and location data. You will develop dataframes and automated pipelines for data from multiple sources, and you are expected to know how to visualize these datasets and apply machine learning algorithms to make predictions. You will be working across teams to get the job done.
Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, OpenStreetMap, demographic data, socio-economic data, and topography to extract useful insights about the events happening on our planet.
Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms useful in tracking global environmental problems like depleting water levels, illegal logging, and even oil spills.
Demonstrate: Familiarity with geospatial libraries such as GDAL/Rasterio for reading and writing data, and with QGIS for making visualizations (see the rasterio sketch at the end of this section). This also extends to using advanced statistical techniques and applying concepts like regression, properties of distributions, and other statistical tests.
Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.
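To give a flavor of the day-to-day work above, here is a minimal sketch of reading a raster with rasterio and summarizing it with NumPy, as referenced in the Demonstrate paragraph. The file name is hypothetical, and a real pipeline would add reprojection, tiling, and cloud masking:

```python
import numpy as np
import rasterio

# Hypothetical single-band GeoTIFF, e.g. a satellite-derived index layer.
with rasterio.open("scene_band1.tif") as src:
    band = src.read(1).astype("float32")   # first band as a NumPy array
    if src.nodata is not None:
        band[band == src.nodata] = np.nan  # mask nodata pixels
    print(src.crs, src.bounds)             # CRS and spatial extent

# Quick summary statistics over the valid pixels.
print(np.nanmin(band), np.nanmean(band), np.nanmax(band))
```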
Requirements
These are the must-have skill sets we are looking for:
- Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
- Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
- Experience with GIS and familiarity with geospatial libraries such as GDAL and rasterio for reading/writing data, a GIS application such as QGIS for visualization and querying, and basic machine learning algorithms for making predictions.
- Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience applying them (see the regression sketch after this list).
- Capable of writing clear and lucid reports and demystifying data for the rest of us.
- Be curious and care about the planet!
- Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
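To make the statistics requirement concrete, here is a small sketch of a regression-plus-test workflow on synthetic data (the rainfall and reservoir-level variables are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic toy data: does rainfall predict reservoir level?
rainfall = rng.normal(100, 20, size=200)
level = 0.5 * rainfall + rng.normal(0, 10, size=200)

# Ordinary least-squares fit plus a significance test of the slope.
fit = stats.linregress(rainfall, level)
print(f"slope={fit.slope:.2f}, r^2={fit.rvalue**2:.2f}, p={fit.pvalue:.3g}")

# Sanity-check the residuals for normality with a Shapiro-Wilk test.
residuals = level - (fit.intercept + fit.slope * rainfall)
print(stats.shapiro(residuals))
```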
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community whose work you can use, contribute to, and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but then there's all the non-work fun, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by using your paid leave.
Datametica is Hiring a Datastage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
- Should have extensive knowledge of Unix shell scripting.
- Understanding of DW principles (fact and dimension tables, dimensional modeling, and data warehousing concepts).
- Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be good at writing complex SQL queries (see the query sketch below).
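For a concrete sense of the "complex SQL over a dimensional model" requirement, here is a self-contained toy sketch using Python's built-in sqlite3 module (the fact_sales/dim_product star schema is hypothetical). It joins a fact table to a dimension table, aggregates revenue by category and month, and ranks categories with a window function (which requires SQLite 3.25+):

```python
import sqlite3

# In-memory toy star schema: one fact table, one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, sale_date TEXT, amount REAL);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO fact_sales  VALUES (1, '2024-01-05', 10.0),
                               (1, '2024-02-07', 20.0),
                               (2, '2024-01-09', 15.0);
""")

# Join the fact table to its dimension, aggregate by category and month,
# then rank categories within each month with a window function.
query = """
SELECT p.category,
       strftime('%Y-%m', f.sale_date)                AS month,
       SUM(f.amount)                                 AS revenue,
       RANK() OVER (PARTITION BY strftime('%Y-%m', f.sale_date)
                    ORDER BY SUM(f.amount) DESC)     AS rank_in_month
FROM fact_sales f
JOIN dim_product p USING (product_id)
GROUP BY p.category, month;
"""
for row in con.execute(query):
    print(row)
```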
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, Datastage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.
Benefits we Provide!
- Working with highly technical, passionate, mission-driven people
- Subsidized Meals & Snacks
- Flexible Schedule
- Approachable leadership
- Access to various learning tools and programs
- Pet Friendly
- Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
MSBI Developer
We have the following opening in our organization:
Years of Experience: 4-8 years
Location: Mumbai (Thane)/BKC/Andheri
Notice Period: Max 15 days or immediate
Educational Qualification: MCA/ME/MSc-IT/BE/B.Tech/BCA/BSc IT in Computer Science
Requirements:
- 3-8 years of consulting or relevant work experience
- Should be proficient in SQL Server 2008 R2 and above.
- Should be excellent at SQL, SSRS, SSIS, and SSAS.
- Data modeling, fact and dimension design, and work on data warehouse or DW architecture design.
- Experience implementing new technologies like Power BI, including Power BI modeling.
- Knowledge of Azure or R programming is an added advantage.
- Experience in BI and visualization technologies (Tableau, Power BI).
- Advanced T-SQL programming skills.
- Can scope out a simple or semi-complex project based on business requirements and achievable benefits
- Evaluate, design, and implement enterprise IT-based business solutions, often working on-site to help customers deploy their solutions.
- Power BI: minimum 2+ years of hands-on experience in Power BI report and dashboard development, plus MS SSAS.
- Hands-on experience writing complex SQL queries.
- Ability to develop workflows using PowerApps and Microsoft Power Automate.
- Customizing SharePoint lists and connecting disparate systems with PowerApps.
- Knowledge of PowerApps and Power Automate licensing.
- Automating business processes with Microsoft Power Automate.
- Ability to create custom connectors for Microsoft Power Automate.
- Experience with API or REST service integrations.
- Understanding of how to implement solutions with multiple data sources.
- Understanding of PowerApps formulas and development methods.
2. Should understand the importance, and the know-how, of taking a machine-learning-based solution to the consumer.
3. Hands-on experience with statistical and machine-learning tools and techniques.
4. Good exposure to deep learning libraries like TensorFlow and PyTorch (a minimal PyTorch sketch follows this list).
5. Experience implementing deep learning techniques for Computer Vision and NLP. The candidate should be able to develop a solution from scratch, working from publicly available GitHub code.
6. Should be able to read research papers and pick out ideas to quickly reproduce the research in whichever deep learning library is most comfortable.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production.
8. Expert-level coding experience in Python.
9. Technologies: Backend – Python.
10. Should have the ability to think in terms of long-term solutions, modularity, and reusability of components.
11. Should be able to work collaboratively: be open to learning from peers as well as constantly bringing new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.
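As a minimal illustration of items 4 and 8, here is a tiny PyTorch sketch that trains a small MLP on random tensors standing in for a real Computer Vision or NLP dataset:

```python
import torch
from torch import nn

# Small MLP classifier: 16 features in, 2 classes out.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)          # hypothetical feature batch
y = torch.randint(0, 2, (64,))   # hypothetical class labels

for step in range(100):          # tiny training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
print(f"final loss: {loss.item():.4f}")
```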

