
3-5 years' experience in the fields of Accounting, Compliance, and Taxation.
Statutory audits, internal audits, tax audits, fixed-asset audits, and finalization of books of accounts of non-corporate entities, including cash flow statements. Preparation & filing of Income Tax returns, GST returns, and TDS; finance and company-matter assignments as well as concurrent audits.
A good working knowledge of Microsoft Office/Excel.
Self-motivated, with good written and verbal communication skills.
Ability to handle multiple tasks.
Excellent knowledge of Tally software, GST, bank reconciliation, filing, TDS, tax, SAP, etc.

About Cosmo Infrasolution Pvt Ltd
Preferred Education & Experience:
• Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience.
• Well-versed in, with 5+ years of hands-on demonstrable experience in:
▪ Data Analysis & Data Modeling
▪ Database Design & Implementation
▪ Database Performance Tuning & Optimization
▪ PL/pgSQL & SQL
• 5+ years of hands-on development experience with relational databases (PostgreSQL/SQL Server/Oracle).
• 5+ years of hands-on development experience in SQL and PL/pgSQL, including stored procedures, functions, triggers, and views.
• Demonstrable working experience with database design principles, SQL query optimization techniques, index management, integrity checks, statistics, and isolation levels.
• Demonstrable working experience in database read & write performance tuning and optimization.
• Knowledge of and experience with Domain-Driven Design (DDD), object-oriented programming (OOP), cloud architecture, and NoSQL database concepts are added values.
• Knowledge of and working experience in the Oil & Gas, Financial, and Automotive domains is a plus.
• Hands-on development experience with one or more NoSQL datastores such as Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc. is a plus.
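To make the index-management and query-optimization expectations above concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for PostgreSQL (the table and column names are illustrative, not from any real schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1_000)],
)

query = "SELECT * FROM orders WHERE customer_id = 42"
before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()  # full table scan

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()   # now an index search
```

In PostgreSQL the equivalent workflow uses EXPLAIN (ANALYZE) and the planner's statistics; the principle is the same: inspect the plan, add or adjust an index, and confirm the scan turns into an index lookup.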
Job Location: Pune/Remote
Work Timings: 2:30 pm to 11:30 pm
Joining Period: Immediate to 20 days
iNFX is a set of micro-services that (mainly) will run on our embedded Linux devices – our payment terminals and edge devices. The purpose is to provide key software building blocks for the things our customers need to do on their retail sites – like processing payments, integrating with devices on the forecourt (petrol dispensers, EV chargers, Tank gauges, price signs), running the workflow on payment terminals and providing for mobile payments & customer engagement.
The team we are looking for would be working on the C++ based microservices. Experience with payments processing would be very beneficial.
While we ideally want C++ developers, it is important to note that this is application development rather than traditional “embedded” development. We want people who understand APIs, protocols, and ideally payments, and who have ideally worked with Linux, although kernel driver development experience is not so important; neither is bare-metal/low-level firmware experience. The software is also intended to be portable to Windows environments, so experience in cross-platform development is important.
Assembler skills are not needed for the Application Engineer role; good C++ and Linux application development skillsets are important, but experience in the payments domain is truly an advantage. Examples of payment-domain experience/expertise would be: understanding of ISO 8583-type messaging, payment transaction types, payment software development exposure, EMV understanding, and payments security and certification.
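For candidates unfamiliar with ISO 8583, the message layout it mentions can be sketched in a few lines. This is a simplified illustration only, assuming an ASCII-hex message; real implementations handle binary encodings, secondary bitmaps, and per-field parsing:

```python
def parse_iso8583_header(msg: str):
    """Extract the MTI and the fields flagged in the primary bitmap.

    Simplified sketch: assumes a 4-digit ASCII MTI followed by a
    16-hex-digit primary bitmap. Real messages may be binary and may
    carry a secondary bitmap (signalled by bit 1 being set).
    """
    mti = msg[:4]  # Message Type Indicator, e.g. 0200 = financial request
    bitmap = bin(int(msg[4:20], 16))[2:].zfill(64)
    present = [i + 1 for i, bit in enumerate(bitmap) if bit == "1"]
    return mti, present

# Bitmap 0x7230... flags fields 2, 3, 4, 7, 11, 12
# (PAN, processing code, amount, transmission date/time, STAN, local time)
mti, fields = parse_iso8583_header("02007230000000000000")
```

A production terminal would go on to parse each flagged field according to its fixed or length-prefixed format, which is where most of the domain expertise the posting asks for comes in.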
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive
modern data engineering techniques and methods with Advanced Analytics to support
business decisions for our clients. Your goal is to support the use of data-driven insights
to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data
pipelines, data streams, reporting tools, information dashboards, data service APIs, data
generators, and other end-user information portals and insight tools. You will be a critical
part of the data supply chain, ensuring that stakeholders can access and manipulate data
for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and
communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must. Expert proficiency in Spark, Scala, and Python.
Must have data migration experience from on-prem to cloud.
Hands-on experience with Kinesis to process & analyze stream data, Event/IoT Hubs, and Cosmos DB.
In-depth understanding of Azure/AWS/GCP cloud, Data Lake, and analytics solutions. Expert-level hands-on experience designing and developing applications on Databricks. Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services.
In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib.
Hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake.
Good working knowledge of code versioning tools (such as Git, Bitbucket, or SVN).
Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs.
Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.). Good to have programming experience with .NET or Spark/Scala.
Experience in creating tables, partitioning, bucketing, loading and aggregating data
using Spark Scala, Spark SQL/PySpark
Knowledge of AWS/Azure/GCP DevOps processes like CI/CD as well as Agile tools
and processes including Git, Jenkins, Jira, and Confluence
Working experience with Visual Studio, PowerShell scripting, and ARM templates. Able to build ingestion to ADLS and enable a BI layer for analytics.
Strong understanding of data modeling and defining conceptual, logical, and physical data models. Big data/analytics/information analysis/database management in the cloud.
IoT/event-driven/microservices in the cloud: experience with private and public cloud architectures, their pros and cons, and migration considerations. Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives.
Working knowledge of RESTful APIs, OAuth2 authorization framework and security
best practices for API Gateways
Guide customers in transforming big data projects, including development and
deployment of big data and AI applications
Guide customers on Data engineering best practices, provide proof of concept, architect solutions and collaborate when needed
2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near real-time data warehousing, and machine learning solutions.
Overall 5+ years' experience in a software development, data engineering, or data analytics role using Python, PySpark, Scala, Spark, Java, or equivalent technologies. Hands-on expertise in Apache Spark (Scala or Python).
3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions.
Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience.
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Basic experience with or knowledge of agile methodologies
AWS Certified: Solutions Architect Professional
Databricks Certified Associate Developer for Apache Spark
Microsoft Certified: Azure Data Engineer Associate
Google Cloud Certified: Professional (GCP)
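As a plain-Python illustration of the bucketing concept listed in the requirements above: Spark's bucketBy hashes the bucketing column into a fixed number of buckets so matching keys co-locate for joins. Spark uses Murmur3 hashing, so the stdlib hash below is only a conceptual stand-in, and the rows are made up for the example:

```python
from collections import defaultdict

def bucket_rows(rows, key, num_buckets=4):
    """Assign each row to a bucket by hashing its key, as Spark's
    bucketBy does conceptually (Spark uses Murmur3, not Python's hash)."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[hash(row[key]) % num_buckets].append(row)
    return buckets

# All rows with the same customer_id land in the same bucket, which is
# what lets Spark join two tables bucketed the same way without a shuffle.
rows = [{"customer_id": i, "total": i * 1.5} for i in range(10)]
buckets = bucket_rows(rows, "customer_id")
```

Partitioning differs from bucketing in that it splits data by distinct column values (one directory per value), while bucketing fixes the number of output groups regardless of cardinality.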
- Meeting with the design team to review website and application requirements.
- Setting tasks and development goals.
- Configuring the company SharePoint systems to specified requirements.
- Developing new web components using XML, .NET, SQL, and C#.
- Designing, coding, and implementing scalable applications.
- Extending SharePoint functionality with forms, web parts, and application technologies.
We are looking for a Senior Platform Engineer responsible for handling our GCP/AWS clouds. The candidate will be responsible for automating the deployment of cloud infrastructure and services to support application development and hosting (architecting, engineering, deploying, and operationally managing the underlying logical and physical cloud computing infrastructure).
Job Description:
● Collaborate with teams to build and deliver solutions implementing serverless, microservice-based, IaaS, PaaS, and containerized architectures in GCP/AWS environments.
● Responsible for deploying highly complex, distributed transaction processing systems.
● Work on continuous improvement of the products through innovation and learning; someone with a knack for benchmarking and optimization.
● Hiring, developing, and cultivating a high-performing and reliable cloud support team.
● Building and operating complex CI/CD pipelines at scale.
● Work with GCP services: Private Service Connect, Cloud Run, Cloud Functions, Pub/Sub, Cloud Storage, and networking.
● Collaborate with Product Management and Product Engineering teams to drive excellence in Google Cloud products and features.
● Ensure efficient data storage and processing functions in accordance with company security policies and best practices in cloud security.
● Ensure scaled database setup/monitoring with near-zero downtime.
- Design and implement software for embedded devices and systems, from requirements to production and commercial deployment.
- Design, develop, code, test, and debug system software; review code.
- Support software QA and optimize I/O performance.
- Interface with hardware design and development; assess third-party and open-source software.
- Hands-on development and troubleshooting on embedded targets; solid programming experience in C or C++.
- Experience programming embedded C/C++ applications, with a strong background in C/C++ inheritance, templates, and pointers.
- Strong in OS concepts such as efficient multi-threading and resource sharing.
- Experience working with firmware, applications, and board support packages.
- Expert knowledge of protocols and interfaces such as RS485, SPI, I2C, ADCs, PWM, and CAN.
- Embedded development tools and methodologies.
- Version control systems: ClearCase / RTC / Git.
- Strong development experience in embedded C and RTOS.
- Detailed knowledge of and experience with microcontrollers/microprocessors (16-bit/32-bit).
- Experience with configuration management & defect-tracking tools.
- Strong debugging skills.
- Must be a self-starter.
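The multi-threading and resource-sharing requirement above boils down to protecting shared state from concurrent modification. A minimal Python sketch of the idea follows; the role itself targets C/C++, where the analogue is a std::mutex or pthread mutex guarding the same read-modify-write:

```python
import threading

def increment_with_lock(num_threads=4, per_thread=10_000):
    """Safely increment a shared counter from several threads."""
    counter = 0
    lock = threading.Lock()

    def worker():
        nonlocal counter
        for _ in range(per_thread):
            with lock:  # serialize the read-modify-write on the shared counter
                counter += 1

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

total = increment_with_lock(2, 5_000)
```

Without the lock, interleaved read-modify-write cycles could lose increments; on an embedded RTOS the same discipline applies to mutexes, semaphores, and interrupt-safe critical sections.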
- Require Angular (2+) developers with a minimum of 2 years' experience for a long-term project.
- The candidate should be able to work independently for the given task.
- Job Location: Gandhinagar, Gujarat.
About Company
- VedikIn Solution is one of the leading web development & web hosting companies, providing a large number of services ranging from web development to mobile apps, eCommerce platforms, cloud solutions, web hosting services, etc.
- We have a good work culture of fixed working hours, free-flowing ideas, monthly entertainment trips & continuous appreciation for good work.
- We have been located in Gandhinagar, Gujarat since 2016. As part of our expansion, we are currently hiring passionate and talented developers who want to join VedikIn's exponential growth & adventurous journey.
Responsibilities
- The candidate should be able to complete assigned work independently in a timely manner.
- The candidate should be able to communicate with the client independently.
Benefits
- Traveling allowance
- Monthly fun activities
- Monthly team building activities along with team lunch/dinner
- Flexible work timing
Job Type: Full-time
Salary: Up to ₹70,000.00 per month
COVID-19 considerations:
Masks are compulsory, and regular sanitization is done at the office.
Key Skills:
o Determining the structure and design of web pages.
o Ensuring user experience determines design choices.
o Developing features to enhance the user experience.
o Striking a balance between functional and aesthetic design.
o Ensuring web design is optimized for smartphones.
o Building reusable code for future use.
o Optimizing web pages for maximum speed and scalability.
o Utilizing a variety of markup languages to write web pages.
o Maintaining brand consistency throughout the design.
o Understanding of key design principles.
o Experience with Angular and React.
o Proficiency in HTML, CSS, JavaScript, and jQuery.
o Understanding of server-side CSS.
o Experience with responsive and adaptive design.
o Understanding of SEO principles.
o Good problem-solving skills.
• Good to have skills:
o Knowledge of additional frameworks like VueJS and NodeJS is a big plus.
o Knowledge of PHP: object-oriented concepts and frameworks such as Laravel or CodeIgniter.
o MySQL concepts.
o Understanding of accessibility and security compliance.
- Roles & Responsibilities -
* We are looking for full-stack/back-end developers.
* Our ideal candidate will be able to build applications from scratch and deliver a complete project, i.e. understand requirements, design the architecture, write reusable code that follows basic coding standards, test, deploy, and support/follow up on features.
* As a start-up, we are keen to work with people who are passionate about technology and who love the speed, chaos, and versatility of start-ups.
Skills and Qualifications -
* Team player with strong work ethic; Detail and deadline-oriented; Take ownership of tasks.
* Good with version control systems like Git.
* Professional understanding of CS fundamentals, data structures and algorithms.
* Knowledge of Python is preferred.
* Hands-on experience with front-end technologies - HTML5, CSS3, Bootstrap library & JavaScript/jQuery.
* Knowledge of Angular 2/ReactJs is a plus.
* Knowledge of Rest API is a plus.
Perks -
* Competitive compensation
* MacBook
* Friendly Leave Policy









