The screen size won't be anywhere near the same. 120" is roughly 6000 square inches. 85" is roughly 3000 square inches. Imagine moving from your current living room to one that's half the floor area. Would you really think it's "nearly the same"?
If you want to check the maths yourself then here are the numbers:
120" diag 16:9 = 105" x 59" (w x h)
85" diag 16:9 = 74" x 42" (w x h)
Two points to note: Projector screen sizes are often quoted as a nominal figure, so you could well find that the diagonal is 115" / 116" / 117" when measured for real.
All figures are rounded to the nearest inch.
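If you want to recreate those numbers for any other size, the arithmetic is just Pythagoras applied to the 16:9 aspect ratio. Here's a quick Python sketch of it (illustrative only; the function name is just for this example):

# Rough sketch: width, height and area of a 16:9 screen from its diagonal.
# The 120" and 85" figures quoted above come straight out of this arithmetic.
import math

def screen_dimensions(diagonal_in, aspect_w=16, aspect_h=9):
    # width : height : diagonal = 16 : 9 : sqrt(16^2 + 9^2) for a 16:9 panel
    diag_units = math.hypot(aspect_w, aspect_h)
    width = diagonal_in * aspect_w / diag_units
    height = diagonal_in * aspect_h / diag_units
    return width, height, width * height

for d in (120, 85):
    w, h, area = screen_dimensions(d)
    print(f'{d}" diagonal: {w:.0f}" x {h:.0f}", about {area:.0f} sq in')
# 120" -> roughly 105" x 59", ~6150 sq in
# 85"  -> roughly 74" x 42",  ~3090 sq in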
Any comparison between a projector and a TV will obviously be affected by the quality of each component. The following comments therefore presume a decent projector from Optoma / Epson / JVC / Sony / Sim2 versus a good LED-backlit LCD TV from Sony / Panasonic / Samsung.
In general, projectors have better scaling engines than TVs for SD and HD material. The reason comes down to the different ways each technology is used.
A TV is for general viewing where the relative screen size versus viewing distance means that the screen fills only a small portion of our field of view.
Where a projector is installed in the same room, we tend not to change the seating position when viewing a projected image, and it's usual to go for a much bigger image than a TV can produce. This means the projected image fills more of our field of view, so any shortcomings in scaling are far more evident than they would be on a TV. A projector therefore needs a better scaling engine if its much larger image is to stand up against the smaller TV image, where defects are more easily hidden.
Clearly though, TVs have some advantages: 4K is now the standard display resolution on TV sets. Higher-end LED TVs will make a brighter image. The colour saturation doesn't depend so much on the ambient light, so the pictures can look more colourful (though not necessarily more natural).
Set against that, the cost of TVs rises steeply above the 'core' size of 65" because bigger panels are made in far smaller volumes, which pushes up manufacturing costs.
Once you get into looking at TV models you're going to find that there's a lot more to buying a screen than brand, size, price and the apps it features.
You'll need to get familiar with the crucial difference between the motion processing rate (a manufactured number that's largely meaningless) and the far more important native panel refresh rate, which will be either 50/60Hz (poor) or 100/120Hz (better).
You'll also need to pay attention to whether the TV can dim the image depending on what's happening in the picture. Edge-lit panels can only dim the entire screen as one. That's fine if the whole image is light or dark, but not so useful for images that mix the two, such as, say, people sat around a camp fire at night.
Another type of back-light has the lighting directly behind the panel but lacks any dimming ability at all. The advantage it offers is more even illumination (it's less patchy in dark scenes compared to edge-lit), but it can't react to picture content.
The best is something called FALD, which stands for Full Array Local Dimming. Here the lights are behind the panel and split into zones, and each zone can be dimmed independently of its neighbours. It won't come as a surprise that the more you spend on a set with FALD, the greater the number of zones and the better the set handles mixed-lighting scenes.
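To picture what the zones are doing, here's a toy Python sketch of the basic idea. It's purely illustrative: the zone counts and the "track the brightest pixel in the zone" rule are my own assumptions, not any manufacturer's actual algorithm.

# Toy local-dimming sketch: split the frame into backlight zones and drive each
# zone's LEDs from the brightest content in that zone, so dark areas go dim
# while bright areas stay lit.
import numpy as np

def zone_backlight_levels(luma, zones_x=8, zones_y=6):
    """luma: 2D array of pixel brightness (0.0-1.0). Returns one level per zone."""
    h, w = luma.shape
    levels = np.zeros((zones_y, zones_x))
    for zy in range(zones_y):
        for zx in range(zones_x):
            block = luma[zy*h//zones_y:(zy+1)*h//zones_y,
                         zx*w//zones_x:(zx+1)*w//zones_x]
            levels[zy, zx] = block.max()   # zone follows its brightest pixel
    return levels

# A campfire-at-night style frame: mostly black with one bright patch.
frame = np.zeros((1080, 1920))
frame[500:580, 900:1020] = 0.9
print(zone_backlight_levels(frame))   # only the zones covering the fire light up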
Next, there's the panel's bit depth. This determines how many colours the panel can actually display. What's really important here is how the panel handles subtle colour gradations at the darker end of the brightness range.
Think about the graduated brightness of an evening sky. Where the TV has just an 8bit display you will see colour bands because there's not enough colour resolution in the display to render a smooth image. The best TVs have a true 10bit panel, but it comes at a price. They aren't cheap. Everything in between uses a half-way house technique called dithering.
Dithering is where a TV can't reproduce a specific colour shade, so the panel quickly alternates the pixels between two colours it can reproduce directly. Done fast enough, the two colours appear to merge into the colour in between. Panels that use this technique will be referred to as "10bit (8bit + 2bit FRC)". You'll still get some colour banding, but it won't be as pronounced as on a native 8bit panel displaying full-colour-range 4K images.
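If it helps, here's a toy Python sketch of the FRC idea. It's only to illustrate the principle; the function name and the 4-frame cycle are assumptions for the example, not how any particular panel actually times it.

# A 10bit level the 8bit panel can't show directly is approximated by
# flickering between the two nearest 8bit levels, with the mix of frames
# chosen so the time-average lands on the missing in-between value.
def frc_frames(level_10bit, n_frames=4):
    low_8bit = level_10bit // 4        # nearest 8bit level below (one 8bit step = four 10bit steps)
    remainder = level_10bit % 4        # how far the target sits above that 8bit level
    return [low_8bit + (1 if i < remainder else 0) for i in range(n_frames)]

frames = frc_frames(514)               # 10bit value 514 sits between 8bit 128 and 129
print(frames)                          # [129, 129, 128, 128]
print(sum(f * 4 for f in frames) / len(frames))   # time-average works out at 514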
If that all wasn't enough, there's the question of which HDR formats the TV will support.
First, there's the broadcast TV HDR format known as Hybrid Log Gamma, or HLG for short. This is used by Sky on its premium UHD sports channels and UHD downloads. Virgin only has this on its BT Sport Premium channel. Freeview doesn't yet have any live UHD broadcasts, so any support tends to be focussed on UHD downloads.
When watching streaming services (Amazon Prime, Netflix Premium, Disney+, and Apple's premium streaming service) you will encounter three systems: HDR10, HDR10+, and DolbyVision.
HDR10 is the core (fallback/basic) HDR system. It's decent, and a noticeable improvement over material without HDR, but its limitation is that its metadata is static, set once for the whole programme, so it can't adapt to make the most of the dynamic range as the picture content changes.
Both HDR10+ and DolbyVision take the basic HDR10 format and add dynamic metadata that can change scene by scene. The good news is that any programme mastered in HDR10+ or DolbyVision, shown on a TV that doesn't support that format, will fall back to core HDR10.
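That fallback logic is simple enough to sketch in a few lines of Python. This is illustrative only; the function and the way formats are passed around are placeholders, not any real player's API.

# Pick the stream's dynamic-metadata format if the TV supports it, otherwise
# drop back to the core HDR10 layer that HDR10+/DolbyVision titles carry.
def pick_hdr_format(stream_formats, tv_supports):
    for fmt in ("DolbyVision", "HDR10+"):          # prefer a dynamic format if both sides have it
        if fmt in stream_formats and fmt in tv_supports:
            return fmt
    if "HDR10" in tv_supports:
        return "HDR10"                             # core fallback
    return "SDR"                                   # no HDR support at all

print(pick_hdr_format({"DolbyVision", "HDR10"}, {"HDR10+", "HDR10"}))   # -> HDR10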
Currently, IIRC, only Panasonic supports both HDR10+ and DolbyVision in a single TV. It's politics and cost. HDR10+ is licence-free but is a Samsung-originated format, so some brands reject it on principle. DolbyVision requires a licence fee payment from the TV manufacturer, which makes the tellies a bit more expensive and hence a little less competitive in the cut-throat TV world.
Whilst on the subject of HDR, there's also the question of how bright the TV can get when displaying HDR images. This is a big problem for cheaper/smaller UHD 4K TVs: there aren't enough backlights fitted to reach the sort of brightness HDR needs to impress. What these lower-grade sets give you is either HDR that looks much the same as SDR, or LEDs that are overdriven and fail prematurely. Cheaper LG sets and the odd Samsung fitted with an LG screen (yes, that happens) have this problem. It should be less of an issue with TVs at £2000-£3000, but it's still worth checking the peak brightness figure (nits). If you're seeing 400-600 nits then that's not really good enough; 800-1000 is about where you need to be.
Pulling most of this together, have a look at the Sony KD85XH9505 at £2999. It has a 10bit panel, a 100Hz native refresh rate, support for HLG, HDR10 and DolbyVision, and HDR brightness of around 900-1000 nits.