11+ SOX 404 Jobs in Chennai | SOX 404 Job openings in Chennai
JOB SUMMARY: The Senior Associate supports the Data Analytics Manager by proposing relevant analytics procedures/tools, executing the analytics, and developing visualization outputs for audits, continuous monitoring/auditing and IA initiatives. The individual’s responsibilities include:
Understanding audit and/or project objectives and assisting the manager in preparing the plan and timelines.
Working with the Process/BU/IA teams for gathering requirements for continuous monitoring/auditing projects.
Working with Internal audit project teams to understand the analytics requirements for audit engagements.
Independently building pilots/prototypes, determining the appropriate visualization tool and designing the views to meet project objectives.
Demonstrating proficiency in data management and data mining (a small illustrative sketch follows this list).
Being highly skilled in visualization tools such as QlikView, Qlik Sense, Power BI, Tableau, Alteryx, etc.
Working with the Data Analytics Manager to develop an analytics program aligned to the overall audit plan.
Showcasing analytics capability to Process management teams to increase adoption of continuous monitoring.
Establishing and maintaining relationships with all key stakeholders of internal audit.
Coaching other data analysts on analytics procedures, coding and tools.
Taking a significant and active role in developing and driving Internal Audit Data Analytics quality and knowledge sharing to enhance the value provided to Internal Audit stakeholders.
Ensuring timely and accurate time tracking.
Continuously focusing on self-development by attending training sessions and seminars and acquiring relevant certifications.
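As an illustration of the kind of continuous-monitoring analytics described above, here is a minimal Python sketch that flags potential duplicate payments; the column names and the data are hypothetical and not part of any actual audit procedure.

import pandas as pd

# Synthetic payment data; in practice this would come from an ERP/BU extract
payments = pd.DataFrame({
    "vendor_id":  ["V1", "V1", "V2", "V3", "V3"],
    "invoice_no": ["1001", "1001", "2001", "3001", "3002"],
    "amount":     [5000.0, 5000.0, 1200.0, 750.0, 750.0],
})

# Flag rows where vendor, invoice number and amount all repeat (possible duplicate payment)
duplicates = payments[payments.duplicated(subset=["vendor_id", "invoice_no", "amount"], keep=False)]
print(duplicates)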
Position Overview: We are seeking a talented Data Engineer with expertise in Power BI to join our team. The ideal candidate will be responsible for designing and implementing data pipelines, as well as developing insightful visualizations and reports using Power BI. Additionally, the candidate should have strong skills in Python, data analytics, PySpark, and Databricks. This role requires a blend of technical expertise, analytical thinking, and effective communication skills.
Key Responsibilities:
- Design, develop, and maintain data pipelines and architectures using PySpark and Databricks.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes (see the PySpark sketch after this list).
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into actionable insights.
- Develop interactive dashboards, reports, and visualizations using Power BI to communicate key metrics and trends.
- Optimize and tune data pipelines for performance, scalability, and reliability.
- Monitor and troubleshoot data infrastructure to ensure data quality, integrity, and availability.
- Implement security measures and best practices to protect sensitive data.
- Stay updated with emerging technologies and best practices in data engineering and data visualization.
- Document processes, workflows, and configurations to maintain a comprehensive knowledge base.
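A minimal sketch of the PySpark/Databricks-style ETL step referenced above; the paths, table and column names are assumptions for illustration, and the Delta output assumes a Databricks-like environment.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files from a landing zone (path is hypothetical)
raw = spark.read.option("header", True).csv("/mnt/landing/raw_orders/")

# Transform: cast types, drop incomplete rows, stamp the load date
curated = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: write a Delta table that a Power BI model could read downstream
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")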
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or related field. (Master’s degree preferred)
- Proven experience as a Data Engineer with expertise in Power BI, Python, PySpark, and Databricks.
- Strong proficiency in Power BI, including data modeling, DAX calculations, and creating interactive reports and dashboards.
- Solid understanding of data analytics concepts and techniques.
- Experience working with Big Data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in programming languages such as Python and SQL.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to work independently and manage multiple tasks simultaneously in a fast-paced environment.
Preferred Qualifications:
- Advanced degree in Computer Science, Engineering, or related field.
- Certifications in Power BI or related technologies.
- Experience with data visualization tools other than Power BI (e.g., Tableau, QlikView).
- Knowledge of machine learning concepts and frameworks.
Location: Chennai
Education: BE/BTech
Experience: Minimum 3 years of experience as a Data Scientist/Data Engineer
Domain knowledge: Data cleaning, modelling, analytics, statistics, machine learning, AI
Requirements:
- Be part of Digital Manufacturing and Industrie 4.0 projects across the client's group of companies
- Design and develop AI/ML models to be deployed across factories
- Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, and SQL and NoSQL databases is required
- Should be strong in statistics, data analysis, data modelling, machine learning techniques and neural networks (a small regression sketch follows this list)
- Prior experience in developing AI and ML models is required
- Experience with data from the Manufacturing Industry would be a plus
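A small regression sketch of the sort of energy-focused model mentioned above, trained on synthetic data; the variables and coefficients are invented purely for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic plant data: energy use as a function of machine load and ambient temperature
rng = np.random.default_rng(1)
load_pct = rng.uniform(30, 100, size=500)
ambient_c = rng.uniform(18, 40, size=500)
energy_kwh = 120 + 3.5 * load_pct + 1.2 * ambient_c + rng.normal(0, 10, size=500)

X = np.column_stack([load_pct, ambient_c])
X_train, X_test, y_train, y_test = train_test_split(X, energy_kwh, random_state=1)

model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))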
Roles and Responsibilities:
- Develop AI and ML models for the Manufacturing Industry with a focus on Energy, Asset Performance Optimization and Logistics
- Multitasking and good communication skills are necessary
- Entrepreneurial attitude
Additional Information:
- Travel: Must be willing to travel for short durations within India and abroad
- Job Location: Chennai
- Reporting to: Team Leader, Energy Management System
At AxionConnect Infosolutions Pvt Ltd
Job Location: Hyderabad/Bangalore/Chennai/Pune/Nagpur
Notice period: Immediate - 15 days
1. Python Developer with Snowflake
Job Description :
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file (a connection sketch follows this list).
- Development of data analysis and data processing engines using Python.
- Good experience in data transformation using Python.
- Experience in Snowflake data load using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning will be added advantage.
- Good understanding of data warehouse (DWH) concepts.
- Interpret/analyze business requirements & functional specifications.
- Good to have: dbt, Fivetran and AWS knowledge.
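A hedged sketch of connecting to Snowflake from Python and loading a file, using the snowflake-connector-python package; the account, credentials, file path and table name are placeholders, not real values.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # Stage a local CSV into the table stage, then load it (table name assumed)
    cur.execute("PUT file:///tmp/sales.csv @%SALES_RAW")
    cur.execute("COPY INTO SALES_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    # Simple check query after the load
    cur.execute("SELECT region, SUM(amount) FROM SALES_RAW GROUP BY region")
    print(cur.fetchall())
finally:
    cur.close()
    conn.close()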
We are looking for someone who can make an impact by enabling innovation and growth; someone with a passion for what they do and a vision for the future.
Responsibilities:
- Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
- Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
- Become an expert on data and trends, both internal and external to Kaleidofin.
- Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
- Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
- Automate scheduling and distribution of reports and support auditing and value realization (a small extract sketch follows this list).
- Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
- Design robust data-centric solutions and architecture that incorporate technology and strong BI solutions to scale up and eliminate repetitive tasks.
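As a rough illustration of the report-automation responsibility above, a small pandas extract that a scheduler (cron/Airflow) could run and distribute; the connection string, table and output path are assumptions.

import pandas as pd
import sqlalchemy

# Hypothetical warehouse connection; replace with real credentials/driver
engine = sqlalchemy.create_engine("postgresql://user:***@host/dbname")

# Pull a small KPI table (table and column names are illustrative)
daily = pd.read_sql("SELECT business_date, metric_name, metric_value FROM kpi_daily", engine)

# Pivot into a date-by-metric summary and save it for email/dashboard distribution
summary = daily.pivot_table(index="business_date", columns="metric_name", values="metric_value")
summary.to_csv("/tmp/daily_kpi_report.csv")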
Requirements:
- Experience leading development efforts through all phases of SDLC.
- 5+ years of hands-on experience designing Analytics and Business Intelligence solutions.
- Experience with QuickSight, Power BI, Tableau and Qlik is a plus.
- Hands on experience in SQL, data management, and scripting (preferably Python).
- Strong data visualisation design skills, data modeling and inference skills.
- Hands-on experience in managing small teams.
- Financial services experience preferred, but not mandatory.
- Strong knowledge of architectural principles, tools, frameworks, and best practices.
- Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
- Team-handling experience preferred for candidates with 5+ years of experience.
- Notice period less than 30 days.
Senior Software Engineer
MUST HAVE:
Power BI with PL/SQL
Experience: 5+ years
Cost: 18 LPA
Work mode: WFH - Hybrid
· 10+ years of Information Technology experience, preferably with Telecom/wireless service providers.
· Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability and releasability; rapid prototyping, data modeling, and decentralized innovation.
· Able to demonstrate an understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, Zachman Architecture Framework, etc.
· Ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers.
· Demonstrated ability to work collaboratively.
· Excellent written, verbal and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.).
· Ability to work in a fast-paced, multiple-project environment on an independent basis and with minimal supervision.
· Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems.
Work Location : Chennai
Experience Level: 5+ years
Package: Up to 18 LPA
Notice Period: Immediate joiners
It's a full-time opportunity with our client.
Mandatory Skills: Machine Learning, Python, Tableau & SQL
Job Requirements:
--2+ years of industry experience in predictive modeling, data science, and Analysis.
--Experience with ML models including but not limited to Regression, Random Forests, XGBoost.
--Experience in an ML engineer or data scientist role building and deploying ML models, or hands-on experience developing deep learning models (a small sketch follows this list).
--Experience writing code in Python and SQL with documentation for reproducibility.
--Strong Proficiency in Tableau.
--Experience handling big datasets, diving into data to discover hidden patterns, using data visualization tools, writing SQL.
--Experience writing and speaking about technical concepts to business, technical, and lay audiences and giving data-driven presentations.
--AWS SageMaker experience is a plus, not required.
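A small sketch of the model-building work listed above: a random forest classifier trained on a public scikit-learn toy dataset; the hyperparameters are arbitrary and only meant to show the workflow.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy dataset standing in for real project data
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))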
In this role, candidates will be responsible for developing Tableau reports. They should be able to write effective and scalable code and improve the functionality of existing reports/systems.
· Design stable, scalable code.
· Identify potential improvements to the current design/processes.
· Participate in multiple project discussions as a senior member of the team.
· Serve as a coach/mentor for junior developers.
Minimum Qualifications
· 3 - 8 Years of experience
· Excellent written and verbal communication skills
Must have skills
· Meaningful work experience
· Extensive work on the BI reporting tool Tableau for developing reports that fulfill end-user requirements.
· Experienced in interacting with business users to analyze business processes and requirements and translate them into visualizations and reports.
· Must have knowledge of selecting appropriate data visualization strategies (e.g., chart types) for specific use cases. Ability to showcase complete dashboard implementations that demonstrate visual best practices (e.g., color themes, visualization layout, interactivity, drill-down capabilities, filtering, etc.).
· You should be able to work independently and have experience working with senior leaders.
· Able to explore options and suggest new solutions and visualization techniques to the customer.
· Experience crafting joins, including joins with custom SQL, and blending data from different data sources using Tableau Desktop.
· Building sophisticated calculations in Tableau Desktop (Aggregate, Date, Logical, String, Table and LOD expressions).
· Working with relational data sources (like Oracle / SQL Server / DB2) and flat files.
· Optimizing user queries and dashboard performance.
· Knowledge in SQL, PL/SQL.
· Knowledge in crafting DB views and materialized views.
· Excellent verbal and written communication skills and interpersonal skills are required.
· Excellent documentation and presentation skills; should be able to build business process mapping documents and functional solution documents, and own the acceptance/sign-off process end to end (E2E).
· Ability to make the right graph choices, use the data blending feature, and connect to several DB technologies.
· Must stay up to date on new and upcoming visualization technologies.
Pref location: Chennai (priority)/ Bengaluru
Ganit Inc. is the fastest growing Data Science & AI company in Chennai.
Founded in 2017 by three industry experts, alumni of the IITs/SPJIMR, each with 17+ years of experience in the field of analytics.
We are in the business of maximising Decision Making Power (DMP) for companies by providing solutions at the intersection of hypothesis-based analytics, discovery-based AI and IoT. Our solutions are a combination of customised services and a functional product suite.
We primarily operate as a US-based start-up with clients across the US, Asia-Pacific and the Middle East, and have offices in the USA (New Jersey) and India (Chennai).
Started with 3 people, the company has grown quickly to 100+ employees.
1. What do we expect from you
- Should possess a minimum of 2 years of experience in data analytics model development and deployment
- Skills relating to core Statistics & Mathematics.
- Huge interest in handling numbers
- Ability to understand all domains in businesses across various sectors
- Natural passion towards numbers, business, coding, visualisation
2. Necessary skill set:
- Proficient in R/Python, Advanced Excel, SQL
- Should have worked with Retail/FMCG/CPG projects solving analytical problems in Sales/Marketing/Supply Chain functions
- Very good understanding of algorithms, mathematical models, statistical techniques and data mining, such as regression models, clustering/segmentation, time-series forecasting, decision trees/random forests, etc. (an illustrative sketch follows this list)
- Ability to choose the right model for the right data and translate that into code in R, Python, VBA (Proven capabilities)
- Should have handled large datasets, with a thorough understanding of SQL
- Ability to handle a team of Data Analysts
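An illustrative clustering/segmentation sketch of the kind referenced in the skill set above, run on synthetic customer data; the column names and the choice of three segments are assumptions.

import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic customer behaviour data
rng = np.random.default_rng(0)
customers = pd.DataFrame({
    "annual_spend": rng.gamma(2.0, 500.0, size=300),
    "visit_frequency": rng.poisson(12, size=300),
})

# Scale features so spend and frequency contribute comparably
scaled = StandardScaler().fit_transform(customers)

# Segment customers into 3 groups (k chosen arbitrarily for the sketch)
customers["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(customers.groupby("segment").mean())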
3. Good to have skill set:
- Microsoft PowerBI / Tableau / Qlik View / Spotfire
4. Job Responsibilities:
- Translate business requirements into technical requirements
- Data extraction, preparation and transformation
- Identify, develop and implement statistical techniques and algorithms that address business challenges and add value to the organisation
- Create and implement data models
- Interact with clients for queries and delivery adoption
5. Screening Methodology
- Problem Solving round (Telephonic Conversation)
- Technical discussion round (Telephonic Conversation)
- Final fitment discussion (Video Round)