
10 common LCA mistakes and how to avoid them

For a successful LCA, you not only need primary data and reliable software – you also need to start with a thorough understanding of your goal and scope. Preventing common LCA mistakes increases your confidence in your results – learn how!

Capacity & Awareness

Like anything data-related, life cycle assessment (LCA) results are susceptible to mistakes or oversights along the way. Being aware of common mistakes and self-assessing your work can save you money on professional reviews and increase your hands-on LCA knowledge! In this article, you’ll learn about the most common mistakes (such as methodological inconsistency, input-data errors, and neglected steps) and how to avoid them to get a reliable and accurate LCA.

Disclaimer: Self-assessment does not replace verification or critical review. However, it can smooth the verification process down the road.

1: Not choosing (the right) LCA standard or Product Category Rules

In LCA, there are several standards and guidelines. They differ by industry and country and can be confusing for the newcomer. But ignoring them is rarely a good strategy!

Consequences of this mistake

LCA is a scientific approach but can be executed in varying ways. The point of having standards is to unify the methodology of LCA assessments within e.g. an industry, so that we can compare their results.

To compare your products to competitors in your industry, your LCA needs to follow the same methods they followed. Certain use cases, such as creating EPDs or making environmental claims, also require specific methodologies. If the LCA is solely for internal use, following standards might not matter as much.

How to prevent

It’s crucial to choose your methodology early on (during the “Goal and Scope” phase of your LCA) because your further steps depend on it.

Thus, do your research: What methods, standards, guidelines, or “Product category rules” (PCRs) apply in your industry? Or in your country? Which rules apply to your intended LCA-use case? This topic is broad: read all about it in our Help Center.

If no standards or PCRs apply, you can refer to the general ISO14040/14044 LCA guidelines.

2: Not following your chosen standard correctly

Your chosen LCA standard or PCRs can determine your LCIA method, database (or even specific datasets), or influence your functional unit and system boundaries!

Consequences of this mistake

If you misapply your standard, your LCA won’t be comparable to other LCAs within your industry. Auditors will raise these issues, and you might have to re-do certain aspects, prolonging the review process.

How to prevent

Read the documentation of your chosen methodology and implement it thoroughly. Ask your colleagues, the standard publisher, or our specialists in case of doubt.

And remember to change the default settings of your LCA software to the right LCIA method and database.

3: Wrong scope

Choosing your assessment scope is part of the “Goal and Scope” phase. Your LCA standard can influence this, too.

Common mistakes here are erroneously excluding relevant product aspects from your scope, or including aspects that are not relevant to your chosen “life cycle model” (Cradle-to-Gate, Cradle-to-Grave, Cradle-to-Cradle).

Consequences of this mistake

Redundant product aspects will distort your results. Missing aspects will make your results incomplete.

How to prevent

Making a flowchart of your scope can help prevent such mistakes. It shows you the processes and materials needed for your product, and in which life cycle phase they occur. To check your scope, go back to this flowchart: did you include each aspect in your system model? Are there aspects in your system model that are not on your flowchart? If so, add them to the flowchart to check whether they lie within your scope.

4: Database inconsistency

The methodological choices behind databases are not necessarily consistent with other databases or certain LCIA methods. Using datasets from a database other than your chosen one…

  1. is a mistake when done accidentally or in the wrong way – for example, filling gaps in your chosen database without accounting for differences, or deliberately picking a dataset from another database because it shows a lower impact.
  2. can be a valid approach to deal with missing datasets, when methodological mismatches are accounted for. This requires advanced LCA knowledge or specialist support, and transparent documentation.

Consequences of this mistake

Not accounting for methodological mismatches will make your LCA results incorrect.

How to prevent

Consistently use the database that is prescribed by your chosen LCA standard or PCR. Do not mix different versions of your database (e.g. Ecoinvent 3.4 with 3.8) – stick to one version.

If nothing is prescribed, the most recent version of Ecoinvent (as of November 2023, this is v. 3.8 in Ecochain software) is recommended, as it is the largest and most transparent database.

5: Not doing Sanity Checks

When looking at your hotspot analysis (or “flat view” in Mobius), you might find something unexpected or unusual (maybe even “insane”), such as:

  • a tiny product aspect having huge impacts
  • a big product aspect, e.g. the main raw material, having tiny or no impacts

Not digging deeper here is a mistake!

Consequences of this mistake

Unexpected results can indicate mistakes in the system model, usually caused by wrong primary input data or wrong dataset use.

How to prevent

Unexpected results are often caused by typos in the input numbers or by mistakes in unit conversions. For example:

  • Inputting numbers in a different unit than the dataset uses (the database might use g while you use kg, or even a different type of unit altogether, such as area (m²) instead of weight).
  • Neglecting the factor of 1000 between m³ and liters, or between kWh and MWh.
  • Confusing the decimal separators . and ,

Thus, convert your units!
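Such unit mistakes can also be caught programmatically before data entry. Below is a minimal sketch (a hypothetical helper, not an Ecochain or Mobius feature) that converts an input into a dataset’s reference unit and fails loudly when the unit types don’t match, e.g. mass entered where energy is expected:

```python
# Conversion factors into base units (kg, kWh, m3) – extend as needed.
TO_BASE = {
    "g": ("kg", 0.001), "kg": ("kg", 1.0), "t": ("kg", 1000.0),
    "Wh": ("kWh", 0.001), "kWh": ("kWh", 1.0), "MWh": ("kWh", 1000.0),
    "L": ("m3", 0.001), "m3": ("m3", 1.0),
}

def convert(amount, unit, dataset_unit):
    """Convert `amount` in `unit` to `dataset_unit`, or raise on a type mismatch."""
    base_from, factor_from = TO_BASE[unit]
    base_to, factor_to = TO_BASE[dataset_unit]
    if base_from != base_to:  # e.g. mass entered where energy is expected
        raise ValueError(f"Cannot convert {unit} to {dataset_unit}")
    return amount * factor_from / factor_to

print(convert(2500, "g", "kg"))    # 2.5
print(convert(1.2, "MWh", "kWh"))  # 1200.0
```

Converting everything through one table like this avoids ad-hoc factor-1000 slips, and the type check surfaces mismatches (such as area vs. weight) immediately instead of after the results look strange.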

Consulting published LCA studies on similar products can help you get a feeling for what you “should” expect.

6: Using suboptimal datasets

Unusual or unexpected (Mistake nr. 5) results can be caused by unsuitable reference datasets. Your datasets might be

  • Out of date (e.g. from an outdated Ecoinvent version)
  • From a geographical scope that isn’t your scope
  • Not the best available match for your product system

Consequences of this mistake

Materials and production methods change over time and geography. Using a suboptimal dataset means your product’s impacts are not reflected as accurately as possible.

How to prevent

Improve geographical scope

Many datasets are available for different regions of the world. Double-check for which regions yours are available, and choose the one closest to your market. This is especially relevant for national electricity datasets!

Improve temporal scope

Check the dataset’s creation date in the documentation – has production technology changed significantly since then?

Tip: Ecochain specialists can help you change the temporal or geographic validity of your background datasets, for example by providing you with a dataset where they changed the electricity mix that goes into your ingredient to another year or country.

Consider supplier EPDs

Using LCA results or EPDs from your suppliers*, instead of relying on average reference datasets, makes your LCA much more accurate. In Mobius, you can input EPD data using the ‘custom impacts’ feature.

*Supplier data needs to be methodologically consistent with your LCA – that’s what industry standards promote! 

7: Sloppy data documentation

Sloppy data documentation leads to chaos, blunders, and a lack of transparency.

Consequences of this mistake

You miss out on the advantages of solid data documentation. These are:

  • Tracing back mistakes. For example, having found a mistake during sanity checks, you can see why you made it and if it repeats elsewhere.
  • Showing data uncertainties, and thus where sensitivity analysis (see Mistake Nr. 9) can add value.
  • Helping you create a transparent LCA report, which is fundamental for verifying and communicating LCA results.

How to prevent

Document each number, calculation, and assumption used in the LCA. Where does this data come from (references), are there conflicting references, and how certain are you of their accuracy (would this data point benefit from further research in future LCA iterations)?

While your LCA software is a great place to bring your data together, tool-external documentation in e.g. Excel gives more room for notes, links, and calculations.
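As a sketch, such documentation can also live in simple structured records, one per data point – the field names below are illustrative, not a prescribed Ecochain format:

```python
from dataclasses import dataclass

@dataclass
class DataPoint:
    name: str                 # e.g. "electricity, production machine"
    value: float
    unit: str
    source: str               # reference: supplier sheet, meter reading, literature...
    assumption: str = ""      # any estimation behind the number
    uncertainty: str = "low"  # "low" / "medium" / "high"

inventory = [
    DataPoint("steel sheet", 1.8, "kg", "supplier invoice 2023-04"),
    DataPoint("electricity", 0.4, "kWh", "estimated from machine rating",
              assumption="assumed 80% utilisation", uncertainty="high"),
]

# High-uncertainty entries are natural candidates for sensitivity analysis
# (see Mistake Nr. 9) and for further research in future LCA iterations.
to_review = [d.name for d in inventory if d.uncertainty == "high"]
print(to_review)  # ['electricity']
```

Keeping value, unit, source, and assumption together per data point makes mistakes traceable and gives a verifier exactly the paper trail they will ask for.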

8: Not involving colleagues

Being on your own little island makes the LCA process less fun, and more prone to mistakes – 4 eyes see more than 2!

Consequences of this mistake

Not involving colleagues might mean you make illogical assumptions and overlook mistakes your colleague would have noticed.

How to prevent

While creating your system model, you need to make some assumptions or estimations. For example on uncertain input data and the applicability of datasets. Explain to a colleague why and how you came to your assumptions, and invite them to be critical. Together, you might discover flawed thinking, or find confirmation.

Let your colleague help you during this self-checking process. A company’s employees know its products best and are equipped to notice missing aspects (Mistake Nr. 3) and unexpected results (Mistake Nr. 5).

9: Missing interpretation

The Interpretation phase is a crucial component of any LCA. Skipping it is a mistake.

Consequences of this mistake

Taking LCA results at face value will leave you clueless about how susceptible your results are to your data uncertainties. Not knowing the quality and conclusions of your results, your audience might take inappropriate (in)action.

How to prevent

Conduct sensitivity analysis

Sensitivity analyses assess how much influence data point variations (e.g. uncertain assumptions, possible data ranges, or alternative reference datasets) have on your LCA results.
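A minimal one-at-a-time sensitivity check can be sketched as follows – the impact factors and amounts are made-up illustrative numbers, not real data:

```python
# Hypothetical impact factors (kg CO2-eq per unit) and inventory amounts.
factors = {"steel": 2.0, "electricity": 0.45, "transport": 0.1}
amounts = {"steel": 1.8, "electricity": 0.4, "transport": 12.0}

def total_impact(amts):
    """Total impact as the sum of amount x impact factor over all inputs."""
    return sum(amts[name] * factors[name] for name in amts)

base = total_impact(amounts)
for name in amounts:
    # Vary one input by +20% while holding the others fixed.
    varied = dict(amounts, **{name: amounts[name] * 1.2})
    delta = (total_impact(varied) - base) / base * 100
    print(f"{name}: +20% input -> {delta:+.1f}% total impact")
```

With these illustrative numbers, a 20% change in the steel input shifts the total by roughly 14%, while the same relative change in electricity shifts it by under 1% – so the steel data point deserves the most scrutiny.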

Some LCA standards (such as the SBK Bepalingsmethode) prescribe specific sensitivity analyses. This ties back to Mistake Nr. 2: applying standards correctly.

Draw conclusions and discuss limitations

Distill the messages from your data: what implications do your findings have for your audience? Taking into account the quality of the system model, how do sensitive data and uncertainties affect the reliability of these messages? Discussions with colleagues and stakeholders can help you draw the right conclusions.

How representative is your LCA: Is it a screening LCA with lots of secondary data, or is your study verified and ready to change the corporate strategy? Tell your audience whether they can confidently take action on the basis of your LCA, or whether further research is needed.

10: Public environmental claims without LCA review

To make “public comparative assertions” – i.e. publicly saying your product is “greener” than a competing product – your LCA must undergo “critical review”, a process defined in ISO14040.

Consequences of this mistake

If your LCA is not verified, you cannot be 100% sure it’s reliable. Undiscovered mistakes can lead to greenwashing allegations if your readers, instead of you, discover them. Furthermore, your LCA won’t be ISO-compliant.

How to prevent

If you want to make public environmental claims: Conduct the self-checks in this article to weed out mistakes in advance. Then, get verified!

For a phased approach:

  • After the Goal and Scope phase, check mistakes nr. 1, 2, and 3.
  • During Life-cycle inventory creation, pay attention to mistakes nr. 4, 6, and 7.
  • After the Life cycle inventory analysis, check mistakes nr. 5 and 8.
  • During Interpretation, check mistakes nr. 9 and 10.


Common LCA mistakes occur in choosing and applying methodology, and in the form of flawed primary and secondary data. The self-checks in this article go a long way to increase your confidence in the accuracy of your LCA results. Improve your LCA step by step, until it is fit for internal use and, later on, for verification or critical review. Throughout your LCA process, solid data documentation and brainstorming with colleagues help tremendously.

And don’t forget your interpretations! Mobius’ scenario testing supports you in conducting sensitivity analysis, showing you the effect of choosing different datasets or making different assumptions – Give it a try!

Lena Nickel

I'm a researcher & writer at Ecochain. During my studies in Global Sustainability Science, LCA really captured my interest. It continues to fascinate me in my current Master's in Energy Science, where I also conduct LCAs myself. I love researching & writing (and learning more!) about these crucial topics for Ecochain's Knowledge Blog.
