R2025a Lessons Learned

How Lessons Learned Are Managed

GMAT lessons learned cover practices we did well and should keep doing, as well as large-scale changes we should make to improve the software or our project. Each lesson learned is discussed by the team, and if we decide there is a real issue, we require a plan for improvement. To make sure we handle lessons learned efficiently, here are some high-level guidelines for creating them.

What is a Lesson Learned

Lessons learned are issues that caused, or could have caused, significant problems, or cases where we did something that significantly improved the software or our project. Lessons learned require group discussion and usually a change in team habits, process, or strategy.

Lessons learned satisfy one of the following criteria:

  • Issue that is putting the project at greater risk than necessary

  • Issue that is causing significant inefficiency

  • Issue that is significantly lowering quality

What is Not a Lesson Learned

A lesson learned is not a minor annoyance, a tweak to an existing process, or something that can be resolved between team members in the everyday process of getting work done. Team members should bring these types of issues up at meetings, or work them out among the team members involved.

A minor issue (i.e., not a lesson learned) satisfies one of these criteria:

  • Tweak to an existing process

  • Minor annoyance or gripe

  • Can be resolved by picking up the phone, discussing via email, or raising it at a weekly meeting

  • Does not require significant change in habits or processes

Things We Should Keep Doing

  • [EGD] Continue to create the Beta/Public software release paperwork at the beginning of the release cycle

  • [EGD] Track work items big and small in Jira, with the understanding that several different types of people view the tickets (i.e.,

Things We Should Change

Do Better

  1. [EGD] Update third-party dependencies earlier in the release development cycle

    • Updating wxWidgets late in the release cycle led to the team scrambling to address issues found in the GUI.

    • Using old versions of third-party dependencies can lead to incompatibilities with the operating systems and other dependencies we support.

  2. [DJC] Have someone run the development builds of GMAT

    • The smaller the pool of testers for GMAT between builds, the greater the risk of missing ways users interact with the software that automated regression testing does not necessarily catch.

    • "Eating your own dog food" => using the tool you are developing yields a higher-quality product

  3. [EGD] Set up the test system to run on different Python versions and system configurations

    • We currently have many machines on which we can perform test runs, and we should leverage them when running the full suite of tests, especially on Windows, since those machines are more available.

  4. [EGD] Document how to run the test system

    • For a critical system, few people on the team thoroughly know how to use it. Cross-training is needed in this area so that when one person is unavailable, others can step in to complete the work.

  5. [DJC] Branch for the release using a naming convention not likely to have tag conflicts (e.g., branch name GMAT-R2025a, tag name R2025a, which follows the convention used on SF)

  6. [PJC] Hold a release kick-off meeting to go through the release checklist and decide who will be in charge of each release section

    • Performing a detailed walkthrough can give members unfamiliar with the release more insight into the process and can highlight areas where assistance is needed.

    • When staffing is limited, various team members will be needed to fill in the gaps as necessary.

  7. [TR] Give the GUI test manager more run time to perform regression tests before release day (~1.5 to 2 months)

    • Specific time should be carved out for the milestone that kicks off the GUI tests needed to support the release. The current release was uncertain until very late in CY 2024.

  8. [EGD] Bring in additional Mac expertise to deal with the anomalies unique to the Apple ecosystem

    • Ramping up development for an ecosystem with little to no in-house experience can consume resources faster than desired; dedicated senior expertise could instead guide and train engineers for a limited duration.

  9. [DJC] Scan the release package to make sure proprietary libraries (e.g., SNOPT) are not included. Should this be automated via a test system run or a manual checklist?

    • Developing ways to minimize human error, especially in high-stress, time-limited moments, will lead to better quality.

  10. [EGD] Clearly identify which punch-list tickets are nice-to-have versus essential.

    • Chasing perfection on a particular work item with limited funds can lead to fewer deliverables for customers and stretch already limited resources thin.

    • Resource loading will have to be performed more accurately to minimize this.
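Item 3 above (running the test system on multiple Python versions) could be driven by a small wrapper script. This is only a sketch: the interpreter list and the run_tests.py entry point are assumptions, not the actual GMAT test driver.

```shell
#!/bin/sh
# Hypothetical sketch: run the regression suite under each available
# Python interpreter on this machine, skipping versions not installed.
# "run_tests.py" stands in for whatever the real test entry point is.
for py in python3.9 python3.10 python3.11 python3.12; do
    if command -v "$py" >/dev/null 2>&1; then
        echo "Running suite under $py"
        "$py" run_tests.py || echo "FAILURES under $py"
    else
        echo "Skipping $py (not installed)"
    fi
done
```

Running the same wrapper on each available test machine would cover the operating-system axis as well as the Python-version axis.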
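The naming convention in item 5 (branch GMAT-R2025a, tag R2025a) can be demonstrated with plain git. The commands below build a throwaway repository purely for illustration; names and identities are placeholders.

```shell
# Create a scratch repository so these commands are safe to try anywhere.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "demo"
git config user.email "demo@example.com"
git commit --allow-empty -q -m "initial commit"

# The convention: distinct names for the release branch and the release tag.
git checkout -q -b GMAT-R2025a           # release branch
git tag -a R2025a -m "R2025a release"    # release tag; no name collision

git branch --list GMAT-R2025a
git tag --list R2025a
```

Had the branch also been named R2025a, ref lookups such as `git rev-parse R2025a` would warn that the refname is ambiguous, which is exactly the conflict the convention avoids.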
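For item 9, an automated check could be as simple as scanning the unpacked release tree's filenames against a blocklist. The directory layout and blocklist below are illustrative assumptions; SNOPT is the example named in the text.

```shell
# Hypothetical sketch: flag the release if any blocklisted (proprietary)
# library name appears in the package.  A scratch tree stands in for the
# real unpacked release directory.
RELEASE_DIR=$(mktemp -d)
mkdir -p "$RELEASE_DIR/bin" "$RELEASE_DIR/lib"
touch "$RELEASE_DIR/bin/GMAT" "$RELEASE_DIR/lib/libGmatUtil.so"

BLOCKLIST="snopt"                        # extend with other proprietary names
hits=$(find "$RELEASE_DIR" -type f | grep -i -E "$BLOCKLIST" || true)
if [ -n "$hits" ]; then
    echo "ERROR: blocklisted files found:"
    echo "$hits"
    status=1
else
    echo "OK: no blocklisted libraries in the package"
    status=0
fi
```

A check like this could run as part of the test system, so the release never depends on a human remembering the manual checklist step.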

Start Doing

  1. [PJC] Allocate sufficient time for training across the team (team training, cross-competency)

  2. [EGD] Prioritize external outreach to potential customers

Stop Doing

  • [EGD] Implementing functionality for a single customer in a way that strictly limits its usage to that customer, when the feature is one other customers will also want.

  • [EGD] Drilling down in pursuit of perfection once a “good enough” implementation has been achieved. “Good enough” still means an implementation that does not break the system’s architecture and follows agreed-upon design principles.