
Sr App Development Analyst

Date:  Nov 15, 2021
Job ID:  2240
Location: Bellevue, WA, US, 98004

Puget Sound Energy is looking to grow our community with top talent like you! With our rapidly growing, award-winning energy efficiency programs, our pathway to an exciting and innovative future is now.

As a federal contractor, PSE requires that all employees be fully vaccinated against COVID-19 and, as a condition of employment, provide proof of vaccination. Accommodations will be considered for those unable to be vaccinated due to a disability/medical condition or a sincerely held religious belief, practice, or observance.

PSE's IT Application Solutions team is looking for qualified candidates to fill an open Sr App Development Analyst position!

Job Description

Puget Sound Energy, Bellevue, WA, seeks Senior Application Development Analysts to perform requirements gathering, gap analysis, and risk analysis, and to provide work and cost estimates for multiple business groups. The role meets with various business groups to understand the complex business challenges and requirements, from a data and/or technical point of view, that will help them better perform their jobs.

Job Responsibilities

- Create technical architecture to support these requirements on AWS and Azure cloud-based platforms, including extracting data from the AWS Athena cloud application to an on-premise SAP HANA database using a workgroup connection as a standard implementation for the client, implementing data flows from AWS Athena to the SAP HANA database, and supporting the project's Hypercare phase for the utility client.
- Analyze gaps in the existing architecture and document them. Provide the work and cost estimates needed to deliver the solution to the business.
- Evaluate and implement new technology to provide better technical solutions and a great user experience, including storage planning for the SAP HANA database through data-flow forecasting, maintenance, query measurement, and cluster optimization techniques.
- Analyze new technologies (Big Data, Hadoop and advanced analytics, AWS, ETL, Scala, and Python) to choose the most efficient and cost-effective solution for new and existing requirements, and implement Python scripts to automate the ETL process on the AWS tools EC2, Glue, Athena, and Lambda (a sketch of this kind of automation follows this list).
- Create proofs of concept (POCs) and present them to business and technical managers. After POC approval, create an implementation plan to deploy the technology. Train users and peers on the new technology.
- Design, develop, modify, debug, and evaluate programs for functional or operational areas, including implementing AWS tools to manage and create data science use cases.
- Implement a cloud data warehouse using the AWS tools S3, Glue, Athena, and Lambda; create technical designs to analyze complex business problems; and develop back-end systems using AWS, EC2, Hadoop, ETL, CloudWatch, Python, etc., ensuring the most scalable and optimal back-end systems.
- Provide efficient production support and issue resolution, work on data requests, and perform code and process optimizations. Resolve incidents and issues with the Big Data/data lake platform alongside the PSE Data team, and develop Scala scripts to process terabytes of CSV data into compressed Parquet format for the AWS Athena cloud data warehouse.
- Work on process improvements and technical performance optimizations, favoring automated processes over manual ones. Use incident-tracking tools such as SNAP to document and resolve issues, and ensure issues are resolved within the SLAs agreed with the business.
- Develop testing methodologies, quality-control standards, and monitoring, and write test scripts to validate development before promoting it to production. Develop test scripts to validate back-end and front-end reporting and analytical solutions, via both automated and manual routes, for large projects and numerous enhancements. Prepare disaster recovery plans for the cloud-based data warehouse and create Python scripts and test cases. Set up quality-monitoring standards and develop alerts for them. Test deliverables thoroughly and document test results in TFS in accordance with testing/audit standards.
- Take accountability for successful application upgrades, system refreshes, data migrations, and support/security patch fixes. Create use cases, project plans, and estimates, and upgrade existing systems to provide a better user experience, greater cost optimization, and increased business productivity.
- Work on data migration to support a multi-tiered SAP environment, including implementing a Python script to reconcile tables across the Oracle DB, the AWS Athena cloud application, and the on-premise SAP HANA database, automating the process with a daily reconciliation report, and performing data extraction, transformation, and loading from legacy systems to AWS S3 cloud storage using SAP BODS.
- Apply security and support patch fixes promptly to provide a stable, secure technical environment, including applying and testing patches.
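For illustration, the following is a minimal Python sketch of the kind of ETL automation described above: it triggers an AWS Glue job via boto3, waits for it to finish, then runs a validation query through an Athena workgroup. The job name, database, workgroup, table, and S3 output location are hypothetical placeholders, not details from this posting.

    import time

    import boto3

    glue = boto3.client("glue")
    athena = boto3.client("athena")

    def run_glue_job(job_name, poll_seconds=30):
        """Start a Glue job and poll until it reaches a terminal state."""
        run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
        while True:
            state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                return state
            time.sleep(poll_seconds)

    def run_athena_query(sql, database, workgroup, output_location):
        """Submit a query through an Athena workgroup and wait for it to finish."""
        query_id = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output_location},
            WorkGroup=workgroup,
        )["QueryExecutionId"]
        while True:
            state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                return state
            time.sleep(5)

    if __name__ == "__main__":
        # Hypothetical job and query names, for illustration only.
        print(run_glue_job("nightly_csv_to_parquet"))
        print(run_athena_query(
            "SELECT COUNT(*) FROM meter_readings",
            database="utility_lake",
            workgroup="primary",
            output_location="s3://example-athena-results/",
        ))

In practice a script like this would run on a schedule (for example from Lambda or EC2), with the polling loops replaced by whatever orchestration the team standardizes on.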

Additional Responsibilities

- Develop reports and design documents for the Customer Insights team using SAP HANA, implementing stored procedures, flowgraphs, and daily automation to provide the visualization, reporting, and dashboard solutions that meet the analytics needs of the various business groups (see the sketch after this list). Work on the customer outage dashboard using outage data for both electric and gas.
- Create and update technical development standards, including coding and visualization standards, as well as strategy and governance documents, and ensure those standards are met while developing solutions. Develop and update strategy and governance standards and templates.
- Document new technical changes and enhancements to existing systems to meet change-control and audit standards.
- Develop solutions using Python scripts for end users by implementing business reports, resolving incidents and problems, and supporting IT business teams through UAT. Work on incidents and resolve any open issues that financial planners may have.
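Below is a minimal sketch of the kind of daily-automated SAP HANA report described above, assuming SAP's hdbcli Python driver. The host, credentials, schema, and stored procedure name are hypothetical placeholders; the real reports would call whatever procedures the Customer Insights team defines.

    import csv

    from hdbcli import dbapi  # SAP's Python driver for HANA

    def export_outage_report(path):
        conn = dbapi.connect(
            address="hana.example.internal",  # placeholder host
            port=30015,
            user="REPORT_USER",
            password="********",
        )
        try:
            cur = conn.cursor()
            # Hypothetical stored procedure that aggregates electric/gas outage data.
            cur.execute('CALL "ANALYTICS"."P_DAILY_OUTAGE_SUMMARY"()')
            rows = cur.fetchall()
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow([d[0] for d in cur.description])  # column headers
                writer.writerows(rows)
        finally:
            conn.close()

    if __name__ == "__main__":
        export_outage_report("daily_outage_summary.csv")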

Minimum Qualifications

Requires a Master's degree (or foreign educational equivalent) in Technology, Computer Science, Software Engineering, or a closely related field and two (2) years of experience in the job offered or a related role. Alternatively, will accept a Bachelor's degree (or foreign educational equivalent) in Technology, Computer Science, Software Engineering, or a closely related field and five (5) years of post-degree, progressive experience in the job offered or a related role. Experience must have included:

- Extracting data from the AWS Athena cloud application to an on-premise SAP HANA database using a workgroup connection as a standard implementation for the client, implementing data flows from AWS Athena to the SAP HANA database, and supporting the project's Hypercare phase for a utility client;
- Implementing a Python script to reconcile tables across an Oracle DB, the AWS Athena cloud application, and an on-premise SAP HANA database, and automating the process with a daily reconciliation report;
- Implementing a cloud data warehouse using AWS tools including S3, Glue, Athena, and Lambda, extracting data from an Oracle DB using SAP BusinessObjects for client utility data migration;
- Creating Python scripts to automate the ETL process on the AWS tools EC2, Glue, Athena, and Lambda;
- Developing Scala scripts to process terabytes of CSV data into compressed Parquet format for the AWS Athena cloud data warehouse (a Python illustration of this conversion follows this list);
- Preparing disaster recovery plans for a cloud-based data warehouse and creating Python scripts and test cases;
- Developing solutions using Python scripts for end users by implementing business reports, resolving incidents and problems, and supporting IT business teams through UAT;
- Performing data extraction, transformation, and loading from legacy systems to AWS S3 cloud storage using SAP BODS;
- Developing reports and design documents for a Customer Insights team using SAP HANA by implementing stored procedures, flowgraphs, and daily automation;
- Conducting storage planning for an SAP HANA database through data-flow forecasting, maintenance, query measurement, and cluster optimization techniques;
- Implementing AWS tools to manage and create data science use cases, including predictive power outage analysis and maintenance reports for clients, with involvement in all phases of development and Hypercare support activities; and
- Implementing a cloud data warehouse using distributions including Cloudera, Hortonworks, and Amazon Web Services, with the Hadoop framework and Spark programming.
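As an illustration of the CSV-to-Parquet conversion listed above (which the posting performs with Scala scripts), here is a minimal Python sketch using pyarrow. The file paths and the Snappy compression choice are assumptions for the example.

    import pyarrow.csv as pv
    import pyarrow.parquet as pq

    def csv_to_parquet(src_path, dest_path):
        # Stream the CSV in record batches so very large files never have
        # to fit in memory at once.
        reader = pv.open_csv(src_path)
        writer = None
        try:
            for batch in reader:
                if writer is None:
                    # Snappy is a common compression choice for Athena tables.
                    writer = pq.ParquetWriter(dest_path, batch.schema, compression="snappy")
                writer.write_batch(batch)
        finally:
            if writer is not None:
                writer.close()

    if __name__ == "__main__":
        csv_to_parquet("meter_readings.csv", "meter_readings.parquet")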

Families and businesses depend on PSE to provide the energy they need to pursue their dreams. Our steadfast commitment to serving Washington communities with safe, dependable and efficient energy started in 1873. Today we're building the Northwest's energy future through efforts like our award-winning energy efficiency programs and our leadership in renewable energy.

 

At PSE we value and respect our employees and provide them opportunities to excel. We offer an expansive pay package that includes competitive compensation, annual goals-based incentive bonuses, comprehensive benefits, a 401(k), a company-paid retirement pension plan, and an employee assistance and wellness program.

 

Puget Sound Energy is committed to providing equal employment opportunity to all qualified applicants. We do not discriminate on the basis of race, color, religion, sex, national origin, age, sexual orientation, gender identity, marital status, veteran status or presence of a disability that with or without reasonable accommodation does not prevent performance of the essential functions of the job, or any other category prohibited by local, state or federal law.

 

Should you have a disability that requires assistance and/or reasonable accommodation with the job application process, please contact the Human Resources Staffing department at jobs@pse.com or 425-462-3017.

