[BUG-216032] Changes to PRIM_MEDIA_CURRENT_URL breaks lots of existing content #3518

sl-service-account opened this issue Apr 3, 2018 · 26 comments

Comments

sl-service-account commented Apr 3, 2018

Steps to Reproduce

Here is an example script:


default
{
    state_entry()
    {
        llSetLinkMedia(LINK_THIS, 0, [PRIM_MEDIA_CURRENT_URL, "panda.place"]);
        llOwnerSay(llList2Json(JSON_ARRAY, llGetLinkMedia(LINK_THIS, 0, [PRIM_MEDIA_CURRENT_URL])));
    }
}

Output: [""]

Actual Behavior

I do not know which server update changed it, because it is not mentioned anywhere in the release notes. But some time very recently, maybe last week, a limitation was added to PRIM_MEDIA_CURRENT_URL: it now requires a scheme such as http:// or https:// before the value will be updated. This breaks lots of existing content that leaves out the scheme, such as my own domain panda.place. This does not affect PRIM_MEDIA_HOME_URL, only PRIM_MEDIA_CURRENT_URL.

This issue was probably introduced in Second Life Server 18.03.27.

Expected Behavior

Output should have been ["panda.place"]

I would prefer if it worked just like before this update.
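
A minimal sketch of the workaround implied above, assuming the new behavior simply refuses scheme-less values; the ensure_scheme helper is hypothetical and not part of the original script:

// Prepend a scheme only when one is missing, so the value passes the new check.
string ensure_scheme(string url)
{
    if (llSubStringIndex(url, "://") == -1)
        return "https://" + url; // assumption: an https scheme is acceptable here
    return url;
}

default
{
    state_entry()
    {
        llSetLinkMedia(LINK_THIS, 0, [PRIM_MEDIA_CURRENT_URL, ensure_scheme("panda.place")]);
        llOwnerSay(llList2Json(JSON_ARRAY, llGetLinkMedia(LINK_THIS, 0, [PRIM_MEDIA_CURRENT_URL])));
    }
}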

Links

Duplicates

Related

Original Jira Fields
Field Value
Issue BUG-216032
Summary Changes to PRIM_MEDIA_CURRENT_URL breaks lots of existing content
Type Bug
Priority Unset
Status Accepted
Resolution Accepted
Reporter Tonaie (tonaie)
Created at 2018-04-03T21:03:11Z
Updated at 2019-07-24T13:31:56Z
Business Unit Platform
Date of First Response 2018-04-03T16:28:46.287-0500
ReOpened Count 0
Severity Unset
System SL Simulator
Target Viewer Version viewer-development

Soft Linden commented at 2018-04-03T21:28:46Z, updated at 2018-04-03T21:29:45Z

Would your content work again if an "http://" prefix were added to the script-supplied URL when no scheme has been specified?

Tonaie commented at 2018-04-03T21:41:26Z, updated at 2018-04-03T21:42:09Z

Possibly. I would prefer that it stayed agnostic about the value, since the field is super handy for permanently storing some custom data on a prim. Could you at least make it so llGetLinkMedia returns the current string for existing content? If not, maybe you could roll back the change and give developers a month to update their code, like you did for the HTTP request update?

Kadah Coba commented at 2018-04-04T04:23:24Z

It's very problematic that the existing data in the field cannot be read now, so there is no possible migration path with this change currently deployed.

If the server adds the prefix on a scripted GET, this may still cause issues with existing content. The field may hold data equal to the field's maximum length, so a server-added prefix would only work reliably if the resulting string is not truncated.

Likewise with a scripted SET, scripts would need to be updated to deal with a new data limit of 7 fewer characters.

The content could be updated to work within the new limitation if it's actually possible to read the fields again.
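
To make the size arithmetic concrete, a minimal sketch assuming the 1024-character media URL field limit: with an 8-character "https://" prefix prepended, only 1016 characters of payload survive untruncated (7 fewer with "http://").

integer MAX_FIELD = 1024;        // assumed field limit
string  PREFIX    = "https://";  // 8 characters; "http://" would be 7

default
{
    state_entry()
    {
        // Largest scheme-less payload that still fits once a prefix is added.
        integer max_payload = MAX_FIELD - llStringLength(PREFIX);
        llOwnerSay("Max payload without truncation: " + (string)max_payload); // 1016
    }
}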

Tonaie commented at 2018-04-10T21:48:59Z

It has been a week and many products are still broken. Any feedback?

Kadah Coba commented at 2018-04-17T02:12:47Z

Can we at least get some means to read the old field back? Right now we can't even migrate, either to work within this new limitation or to move to another data storage method (likely additional scripts and an external web DB; Experience Persistent Storage has too many limitations to be an option).

Right now our only options are:

  1. Recreate everything by hand and start over, thousands of hours of work over the last couple of years
  2. Cry

Kadah Coba commented at 2018-04-22T10:43:38Z

We're getting bugged daily about this. Could we get an ETA for when there might be an update, so we have something other than "still nothing, waiting on LL" to say? :(

Whirly Fizzle commented at 2018-04-22T20:53:26Z

Another complaint filed at BUG-216152

Oz Linden commented at 2018-04-24T19:46:17Z

We installed an important security patch which, as a side effect, no longer infers http for a URL that has no scheme (in my opinion a good change). That broke scripts that used the field for a URL but left off the scheme (e.g. "myserver.example.com/myapi"). In order to get those slightly sloppy but legitimate uses working again, we added the scheme for them (but used https, because everyone always should).

Storing data in that URL field was never intended to work.

The fact that we broke uses that were never legitimate is unfortunate, but not something I feel an obligation to maintain compatibility with. We'll try to help you recover data that's trapped there, but we won't change the behavior so that you can keep doing that. If you have scripts that you need to get the data back out of, let us know and we'll try to work with you.

Tonaie commented at 2018-04-24T21:28:39Z

I have a few hundred prim media entries that are affected. Adding https:// is fine, but prim media faces that used all 1024 bytes still return "". Could you simply shift off the last 8 bytes in those cases instead of returning nothing? Those characters can usually be guessed, and that would allow us to recover the data ourselves.
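
As a hypothetical illustration of this suggestion (not an actual server change), the proposed recovery rule would look roughly like this, assuming a 1024-character field and an 8-character added prefix:

// Return what could still be recovered if over-long values were truncated
// instead of being replaced with "".
string recoverable_part(string original)
{
    if (llStringLength(original) > 1016)
        return llGetSubString(original, 0, 1015); // keep the first 1016 characters
    return original;
}

default
{
    state_entry()
    {
        llOwnerSay(recoverable_part("panda.place")); // short values pass through unchanged
    }
}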

Kadah Coba commented at 2018-04-24T22:57:05Z, updated at 2018-04-24T23:10:47Z

Thanks, Oz.

What Tonaie said. We still can't read most of the faces.

In my case I had used it as extended shared RAM for dynamically generated content (I've been working on a pure LSL dungeon generator for the last few years, when I have time). Each script would only have to work on <2KB of data at a time and could thus contain more code, resulting in far fewer total scripts.

Edit: to be clear, fields whose scheme-less value is longer than 1016 bytes still return "".
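
For context, a hedged sketch of the storage pattern being described, illustrative only and not the actual scripts: each face's PRIM_MEDIA_CURRENT_URL acts as a roughly 1 KB data slot that a script writes and later reads back.

// Write a chunk of generated data into a face's media URL field.
store_chunk(integer face, string chunk)
{
    llSetLinkMedia(LINK_THIS, face, [PRIM_MEDIA_CURRENT_URL, chunk]);
}

// Read the chunk back from a face.
string load_chunk(integer face)
{
    return llList2String(llGetLinkMedia(LINK_THIS, face, [PRIM_MEDIA_CURRENT_URL]), 0);
}

default
{
    state_entry()
    {
        store_chunk(0, "room=3;seed=42"); // hypothetical payload
        llOwnerSay(load_chunk(0));        // after the change this read yields "" for scheme-less values
    }
}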

Tonaie commented at 2018-05-03T10:56:57Z

There may also be legitimate uses with long URLs that are broken by the 1016-byte issue.

Kadah Coba commented at 2018-05-12T23:03:35Z

I've rezzed boxes containing the 107 affected prims (963 faces, 1926 "fields"...) at Testylvania. I also sent Oz a notecard with the same objects in it via IM. Let me know if there is anything else you need me to do. Thanks!

http://maps.secondlife.com/secondlife/Testylvania%20Sandbox/114/35/22

Mazidox Linden commented at 2018-05-15T18:55:55Z, updated at 2018-05-15T20:39:21Z

Hi there,

The regions Debug1, Debug2, Testylvania Sandbox, and Unpackistan are temporarily running an older build that should allow you to unpack your media URL field into something that will work going forward.

Edit: Debug1 and Debug2 should be available to everyone now!

Tonaie commented at 2018-05-15T18:59:52Z

Thank you, we are probably going to need at least a week due to the large amount of data that needs to be extracted.

Kadah Coba commented at 2018-05-15T22:33:35Z, updated at 2018-05-15T22:43:59Z

Thanks but I'm still getting "" on the media URL field while in Debug1 and Testylvania Sandbox.

Edit: Has the roll not happened yet? Or maybe not rolled back far enough?

Tonaie commented at 2018-05-15T22:42:33Z

The info box says the server is Second Life RC Cruller 18.03.29.513939, which I think is the patch that broke it in the first place. The behavior is consistent with that patch: PRIM_MEDIA_HOME_URL is untouched, but PRIM_MEDIA_CURRENT_URL returns nothing for values without a scheme.

Kadah Coba commented at 2018-05-15T22:54:30Z

I've rezzed two test objects at http://maps.secondlife.com/secondlife/Testylvania%20Sandbox/114/35/22 so you guys can test directly if needed.

On touch, they will say the contents of PRIM_MEDIA_CURRENT_URL for the first 3 faces in local chat.

Expected behavior: the yellow one (WD0) should return 3 lists, each containing a non-empty string. The blue prim (DB0) should return 1 list containing a non-empty string and 2 empty lists.

Actual behavior: the yellow one returns 3 lists, each containing an empty string. The blue one returns 1 list containing an empty string and 2 empty lists.
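
The touch scripts themselves are not included in the report; a hedged reconstruction of what they likely do:

default
{
    touch_start(integer total_number)
    {
        // Say the PRIM_MEDIA_CURRENT_URL entry for faces 0-2 in local chat.
        integer face;
        for (face = 0; face < 3; ++face)
        {
            list media = llGetLinkMedia(LINK_THIS, face, [PRIM_MEDIA_CURRENT_URL]);
            llSay(0, llList2Json(JSON_ARRAY, media));
        }
    }
}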

Kadah Coba commented at 2018-05-16T04:46:11Z

For completeness, I also tested on Debug2, same issue. I don't have access to Unpackistan.

Mazidox Linden commented at 2018-05-16T19:07:26Z

Tonaie and Kadah, are you able to unpack your existing information now? The right version should be on the aforementioned regions now.

Kadah Coba commented at 2018-05-16T22:22:40Z, updated at 2018-05-16T22:31:18Z

Mazidox, no change in Debug1 or Testylvania Sandbox. I even recompiled the scripts just in case that mattered.

Edit: Tonaie says it's working for them though, so I'm not sure why I can't get it working myself. Will keep poking at it.

Tonaie commented at 2018-05-16T22:54:04Z

I have managed to successfully extract data but it is not going very fast because of the large linksets I have to rez to do this, and then I have to test them in another sim. After an evening of work I have extracted around 8% of the total data, and I still need to verify that everything is correct. So I hope you are willing to keep the sim open for some time.

Kadah Coba commented at 2018-05-16T22:56:13Z

Something odd is happening. Some objects will return data for PRIM_MEDIA_CURRENT_URL, others still return "", while some are adding "http://" to PRIM_MEDIA_HOME_URL and PRIM_MEDIA_CURRENT_URL.

It might be that objects that were rezzed post-18.03.14.513292 have had their media fields altered and will give what I'm seeing even on 18.03.14.513292. I will have to do more testing; I have copies of these prims in inventory that have not been rezzed post-18.03.14.513292.

Kadah Coba commented at 2018-05-17T05:26:47Z

Figured it out. Anything rezzed on or after 18.03.27.513831 has been altered by the region and will exhibit the behavior and limitations of that version even while on an 18.03.14.513292 region, e.g. "https://" prefixes and empty-string returns for anything that was longer than 1016 characters.

Luckily we have copies of everything in inventory from before the change, so we should be able to get everything we need. Unfortunately this means all the prims we pulled over the weekend are useless and we'll need to redo that part on an 18.03.14.513292 region.

This will take a while. Will post back when we're finished. Thanks!
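
A hedged migration sketch based on the observation above, assuming the region prepended "https://" (or "http://") to values that originally had no scheme; stripping the added prefix on read would recover the original payload:

string strip_added_scheme(string value)
{
    if (llGetSubString(value, 0, 7) == "https://")
        return llGetSubString(value, 8, -1);
    if (llGetSubString(value, 0, 6) == "http://")
        return llGetSubString(value, 7, -1);
    return value; // no recognised prefix, return unchanged
}

default
{
    state_entry()
    {
        string raw = llList2String(llGetLinkMedia(LINK_THIS, 0, [PRIM_MEDIA_CURRENT_URL]), 0);
        llOwnerSay(strip_added_scheme(raw));
    }
}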

Mazidox Linden commented at 2018-06-04T20:56:57Z

Hi there everyone,

The current plan is to decommission RC Cruller this week (specifically Wednesday, when we roll RC BlueSteel, RC LeTigre, and RC Magnum). If you are absolutely unable to complete your unpacking by then, please reply with how much longer you need.

Tonaie commented at 2018-06-04T20:59:03Z

We are done, but comments are locked I think, in case somebody else needs to restore their data and can't say so here.

Kadah Coba commented at 2018-06-05T02:28:56Z

Usually people that can't comment on Jiras like this contact me directly about the thing. I haven't had anyone else say anything about this, so I'm assuming nobody else needs recovery.

If there is anyone else, they can poke me in-world about it if they can't comment on this Jira.
