
I have an ASP.NET Core Razor Pages website where most pages are purely server-rendered, and they’re all indexed by Google just fine. However, I have three pages that contain Blazor WebAssembly components, and these pages are crawled but not indexed by Google Search Console ("Crawled - currently not indexed"). When I try to "REQUEST INDEXING" through GSC, I get the following error: "Oops! Something went wrong. We had a problem submitting your indexing request. Please try again later."

Here’s an example page that throws an error when trying to request indexing:

@page "/exercises/left/{id:int?}"
@using Keyboardy.Backend.Client.Services
@model Keyboardy.Backend.Pages.Exercises.LeftModel


<section class="section">
    <h1> This is important information </h1>
    <p> This is also important</p>

    <div id="interactive-section">
        <component type="typeof(Core.Blazor.KeyboardContainer)" render-mode="WebAssembly" />
    </div>
</section>

@section Scripts {
    <script src="~/_framework/blazor.webassembly.js"></script>
}

I’m aware that Google doesn’t execute or index client-side Blazor (WebAssembly) content, but in this case, I'm concerned with the HTML that is already rendered on the server, not client-side Blazor.

What I’ve tried:

  • Other purely Razor Pages are indexed correctly.
  • Tried delaying the Blazor boot script (a setTimeout before loading blazor.webassembly.js) and requested indexing again, but I still got the "Oops! Something went wrong" error. (A sketch of the documented way to delay the boot appears after this list.)
  • Removed the line <component type="typeof(Core.Blazor.KeyboardContainer)" render-mode="WebAssembly" />, and the problem persisted. So I suspect the error occurs because the page loads a ton of .wasm files through this line: <script src="~/_framework/blazor.webassembly.js"></script>
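
For reference, Blazor has a documented mechanism for delaying boot that is cleaner than wrapping the script load in a setTimeout: set autostart="false" on the script tag and call Blazor.start() manually. A minimal sketch (the 3-second delay is arbitrary and only illustrates the technique):

@section Scripts {
    <script src="~/_framework/blazor.webassembly.js" autostart="false"></script>
    <script>
        // Boot Blazor manually after a short delay instead of at script load.
        setTimeout(() => Blazor.start(), 3000);
    </script>
}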

What can I do to get pages indexed that consist of server-side rendered HTML but also contain client-side Blazor components?

  • Note: Although the page contains dynamically generated content through Blazor, I am only interested in indexing the static HTML content. Commented Nov 13 at 9:00
  • github.com/dotnet/aspnetcore/discussions/64346: maybe someone has an answer there. Commented Nov 22 at 22:19

1 Answer


Here is what I ended up doing: conditionally load Blazor for bots. Detect crawlers and skip loading the Blazor script entirely for them.
I don't believe this is the best way to combine Blazor with SEO-indexed pages, but it works.

@page "/exercises/left/{id:int?}"
@using Keyboardy.Backend.Client.Services
@model Keyboardy.Backend.Pages.Exercises.LeftModel
@{
    var userAgent = Request.Headers["User-Agent"].ToString().ToLower();
    var from = Request.Headers["From"].ToString().ToLower();
    var isGoogleBot = userAgent.Contains("googlebot")
                      || userAgent.Contains("crawler")
                      || from.Contains("googlebot");
}

<section class="section">
    <h1> This is important information </h1>
    <p> This is also important</p>

    <div id="interactive-section">
        <component type="typeof(Core.Blazor.KeyboardContainer)" render-mode="WebAssemblyPrerendered" />
    </div>
</section>

@section Scripts {
    @if (!isGoogleBot)
    {
        <script src="~/_framework/blazor.webassembly.js"></script>
    }
}
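
Note that the render-mode is also changed from WebAssembly to WebAssemblyPrerendered, so the component's HTML is rendered into the initial server response and crawlers see real markup even without the script. Prerendering like this assumes a hosted Blazor WebAssembly setup where the Razor Pages server references the client project and serves the framework files; a minimal Program.cs sketch of such a host (illustrative, adjust to your actual pipeline):

// Program.cs in the server project (hosted Blazor WebAssembly layout).
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRazorPages();

var app = builder.Build();

app.UseBlazorFrameworkFiles(); // serves ~/_framework/* from the client project
app.UseStaticFiles();
app.UseRouting();
app.MapRazorPages();

app.Run();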

You can check the page yourself: https://keyboardy.net/Exercises/Left/1.
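
If the same check is needed on several pages, it may be cleaner to centralize it in an extension method. A hypothetical sketch (the token list is illustrative and will miss many crawlers; robust detection needs a maintained list or reverse-DNS verification):

// BotDetection.cs: hypothetical helper, not part of the original answer.
using System;
using System.Linq;
using Microsoft.AspNetCore.Http;

public static class BotDetection
{
    // Illustrative, non-exhaustive list of crawler User-Agent substrings.
    private static readonly string[] BotTokens =
        { "googlebot", "bingbot", "crawler", "spider" };

    public static bool IsKnownBot(this HttpRequest request)
    {
        var userAgent = request.Headers["User-Agent"].ToString();
        var from = request.Headers["From"].ToString();

        // Case-insensitive substring match against both headers.
        return BotTokens.Any(token =>
            userAgent.Contains(token, StringComparison.OrdinalIgnoreCase) ||
            from.Contains(token, StringComparison.OrdinalIgnoreCase));
    }
}

The page condition then becomes @if (!Request.IsKnownBot()) { ... } around the script tag.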
