Use the fetch() streaming expression to wrap the export stream and join
on the id to pull in the large field from its stored value, so it
doesn't need docValues.
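
Something like this (untested sketch; "collection1", "small_field", and
"data" are placeholder names, and it assumes "data" is kept as a plain
stored string and "id" is the uniqueKey):

fetch(collection1,
      search(collection1,
             q="*:*",
             qt="/export",
             fl="id,small_field",
             sort="id asc"),
      fl="data",
      on="id=id")

fetch() reads the tuples coming off the export stream in batches and
looks up the stored "data" field by id, so the large field only needs
stored="true", not docValues.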

Mathew

On Fri, Jan 23, 2026, 7:50 AM Ufuk YILMAZ via users <[email protected]>
wrote:

> I am a fan of the export handler, so I try to have docValues on all my
> fields so that I can export them. If a field is analyzed, I just make a
> companion string copyField that has docValues.
>
> Today I came across a situation that I couldn't solve. I have a field
> that stores a very long string. If I simply configure it with:
>
> `indexed="false" stored="false" docValues="true"
> useDocValuesAsStored="true"`
>
> then at index time it throws an error saying something like:
>
> `DocValuesField "data" is too large, must be <= 32766`
>
> Then I thought of storing it as binary:
>
> `<field name="data" type="binary" indexed="false" stored="false"
> docValues="true" useDocValuesAsStored="true"/>`
>
> I was able to index the large field this way. However, when I called
> the export handler, it threw an error saying "field type must be one of
> these: int, long, string, SortableText...", which doesn't include
> "binary".
>
> Then I tried a plain stored string (type=string, stored=true,
> docValues=false), which worked for indexing but not for the export
> handler, since it lacks docValues (an AI hallucinated that newer Solr
> versions support exporting stored fields...).
>
> So, is there a way to have a large field and still be able to use the
> export handler?
>
>
> --ufuk
>
