Is it possible to add large data sources for GPT-3 to reference?

I'm looking to build a sports facts bot for something like baseball, for example. Is there a way I can feed it a ton of custom details, tables, and figures that can be kept up to date, so it can reference them and give responses based on that data?

I know one of the limitations of the GPT-3 model is that it's only trained on data up to some point in 2019, so being able to reference current data sources would be crucial for this use case.

Looking through the documentation, I can't find a way to do it that wouldn't burn thousands of tokens on every query (or even just hit the token cap outright). Does anyone know of a cleaner way of doing it?
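To make the token problem concrete, here's a rough back-of-the-envelope sketch of what naively stuffing a stats table into every prompt costs. The table contents and the ~4 characters-per-token heuristic are illustrative assumptions, not exact tokenizer counts:

```python
# Rough sketch of why stuffing reference data into every prompt is costly.
# Uses the common rule of thumb of ~4 characters per token (a rough
# estimate, not an exact tokenizer count).

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

# Hypothetical slice of a baseball stats table to include as context.
stats_table = "\n".join(
    f"Player {i}: AVG .{250 + i}, HR {10 + i}, RBI {40 + i}" for i in range(500)
)

question = "Who led the team in home runs this season?"
prompt = f"Answer using the stats below.\n\n{stats_table}\n\nQ: {question}\nA:"

tokens = estimate_tokens(prompt)
print(f"Estimated prompt tokens: {tokens}")
# Even a modest table pushes the prompt into the thousands of tokens,
# which has to fit inside the model's context limit alongside the answer.
```

So even a few hundred rows of context would dominate the prompt on every single question, which is exactly the cost I'm hoping to avoid.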

Cheers for any advice!