Software Solution Architect

Location: Remote. Type: Full-time.

A resume in English is required. Work hours are in the US Pacific (PST) time zone, California.

We are seeking a highly skilled and motivated Software Solution Architect with extensive expertise in designing and implementing big-data solutions on the Amazon Web Services (AWS) platform.

As a key member of our client's team, you will play a pivotal role in driving the success of our data-driven initiatives, leveraging AWS services to build scalable, reliable, and high-performance big-data solutions.

Accountabilities

Solution Design:

  • Collaborate with cross-functional teams, including product managers, engineers, and data scientists, to understand business requirements and translate them into innovative and efficient big-data solutions on AWS.

Mandatory skills:

Solution architecture, Python, PySpark, Airflow, AWS Glue.

Architecture Development:

  • Design and architect AWS-based big-data solutions, ensuring they are well-architected, cost-effective, and aligned with industry best practices.  
  • Leverage your knowledge of AWS services such as Amazon S3, EMR, Glue, Redshift, and DynamoDB to craft robust architectures.

Technology Evaluation: 

  • Stay up-to-date with the latest AWS services, big-data technologies, and best practices. 
  • Evaluate and recommend appropriate tools and technologies to enhance data processing, storage, and analytics capabilities. 
Implementation and Deployment:

  • Lead the technical implementation of big-data solutions on AWS, taking ownership of the end-to-end development lifecycle, from prototyping to production deployment.

Performance Optimization:

  • Identify performance bottlenecks and optimize big-data solutions for performance, scalability, and cost-efficiency.

Security and Compliance: 

  • Ensure that the solutions adhere to stringent security and compliance standards, protecting sensitive data and meeting industry-specific regulations.

Collaboration and Communication:

  • Work closely with stakeholders to understand their needs, provide technical guidance, and communicate complex technical concepts clearly and understandably.

Troubleshooting and Support:

  • Assist in troubleshooting and resolving technical issues related to big-data solutions, offering timely support to maintain system uptime and reliability. 
  • Familiarity with the AWS ecosystem: a deep understanding of AWS services, including SageMaker training jobs and processing jobs, is a significant advantage and will allow you to use AWS services efficiently and optimize data workflows.
  • Hands-on experience with AWS Glue, the Glue Data Catalog, crawlers, Lambda, Airflow, IAM, S3, Athena, Redshift, Python, PySpark, SQL, and DynamoDB; extensive experience applying data transformations; proficiency with Git/Bitbucket.
  • Provide technical leadership to the data engineering team, mentor junior engineers, and foster a collaborative and innovative environment. Collaborate with stakeholders to understand business needs and translate them into technical requirements.
  • Ensure data quality, accuracy, and consistency throughout the data pipeline.
  • Implement and enforce data security measures and compliance standards. 
  • Create and maintain comprehensive documentation for data engineering processes, data models, and system architecture.

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 
  • Proven experience as a Solution Architect or similar role, with a focus on designing and implementing big-data solutions on AWS. 
  • Strong expertise in AWS services relevant to big data, including but not limited to Amazon S3, EMR, Glue, SageMaker, Redshift, Athena, and DynamoDB.
  • Proficiency in data modeling, data warehousing, and data integration concepts. 
  • Hands-on experience with programming languages like Python, Java, or Scala for big-data processing and analytics. 
  • Familiarity with big-data processing frameworks such as Apache Spark, Hadoop, or Apache Flink. 
  • Solid understanding of data security, encryption, and compliance best practices on AWS. 
  • Excellent problem-solving skills and the ability to think critically to address complex technical challenges. 
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with both technical and non-technical stakeholders. 

CKCODECONNECT is an Equal Opportunity Employer and does not discriminate based on race, age, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status, or any other characteristic protected by applicable law.

We will guide you in your career journey!
