This repository has been archived by the owner on Mar 28, 2024. It is now read-only.

[BUG-134167] Script Memory Cap Increase as a Premium Feature? - Thoughts from the Aug 22, 2017 Server User Group #2205

Closed
4 tasks
sl-service-account opened this issue Aug 22, 2017 · 2 comments

Comments


sl-service-account commented Aug 22, 2017

How would you like the feature to work?

(Oz told me it was ok to make a new JIRA post about this)

Allow users to select a higher memory cap through a drop-down at compile time. I suggest caps of 16/32/64k and so on, with the highest option preferably not lower than 256k.

Being able to select a memory limit above 64k could be limited to premium users. I for one would finally get a premium account if this was added.

Why is this feature important to you? How would it benefit the community?

Issues this would solve in comparison to today's practice of using many small scripts instead of few larger ones:

  • Less overall memory use, since a script consumes memory just by having a state and an event listener.

  • Less script time spent on string parsing when passing data between scripts. Right now, the primary way to communicate between scripts in a linkset is llMessageLinked. If you need to pass lists (which you usually do), you have to serialize them to strings and parse them back, which is much slower than using lists directly, especially when a project winds up with many scripts. Every script with a link_message handler then has to check each message to see whether it was targeted at it, which costs additional script time.

  • Fewer asynchrony issues. One of the pains of LSL development is that there are no anonymous functions to use as callbacks, which makes development painful when many scripts need to share up-to-date data. Not having to split scripts would remedy that.
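The string-parsing overhead described above can be sketched in LSL (illustrative fragments; the channel number 42 and the "|" separator are arbitrary choices for this example):

```lsl
// Sender script: passing a list to a sibling script in the linkset
// requires flattening it to a string first.
default
{
    touch_start(integer n)
    {
        list data = [1, 2.5, "three", <0.0, 0.0, 1.0>];
        // llDumpList2String discards type information;
        // the receiver gets a single delimited string.
        llMessageLinked(LINK_SET, 42, llDumpList2String(data, "|"), NULL_KEY);
    }
}
```

```lsl
// Receiver script: every script with a link_message handler sees
// every message and must filter for its own channel number.
default
{
    link_message(integer sender, integer num, string msg, key id)
    {
        if (num != 42) return; // not for us, but checking still cost script time

        list data = llParseString2List(msg, ["|"], []);
        // All elements are now strings and must be cast back by hand,
        // e.g. (integer)llList2String(data, 0).
    }
}
```

With a single larger script, `data` could simply be passed to a function or kept in a global, with no serialization and no per-message filtering.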

Potential issues this would create, and suggested fixes:

  • Issue: Feature is premium only. A premium user compiles a script using 256k and sends it to a non-premium user. What happens now?

  • Suggested Fix: The non-premium user can use the script as is, but won't be able to re-compile it without a premium account. If the script is moddable and they copy-paste the code into a new script, it will be compiled under their own settings, potentially generating a stack overflow if it consumes too much memory. Likewise if the user's premium expires: they can still use the script, but not compile it until they restore premium.

  • Issue: Beginner scripters setting memory cap to max, even if they don't need it.

  • Suggested Fix: Have the drop-down default to 64k, labeled "recommended". If it is changed to anything higher, show a small notice akin to "This is a high memory allocation. If you're not sure what this is, leave it at 64k (default)". Or do as you did for mesh import: require a short questionnaire before the feature can be used. There were complaints that nobody uses llSetMemoryLimit, but most people don't know how that feature works or what it does. Putting a memory limit option directly in the compile window, with a hover tooltip, would help.

  • Issue: Legacy viewers!

  • Suggested Fix: If no memory limit selection is sent to the server at compile time for a Mono script, it should compile with the default 64k.

  • Issue: Would this work for mono only?

  • Suggested Fix: Probably, yeah. People today sometimes say they compile things as LSL2 to save memory. If they could limit Mono to 16k or lower for their tiny scripts, there would be no real need to compile as LSL2 other than for legacy purposes.

  • Issue: With 4 times more script memory, it would be 4 times easier for abusers to flood the region's allocated memory.

  • Suggested fix: Limit the feature to premium accounts. People will be less prone to abuse if there's a monetary investment in it.

  • Issue: What about limiting this to experiences?

  • Solution: Please no. If these improvements help reduce load on the region, it would be to everyone's benefit if projects could utilize this anywhere in the world.

  • Issue: A higher memory cap would slow region crossings.

  • Solution: Oz mentioned this. Currently scripters work around the memory limit by splitting their project into multiple scripts. Would a 256k script cause more region-crossing delay than 4x 64k scripts? And if so, by how much?
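For reference, the existing llSetMemoryLimit call mentioned above looks like this in a Mono script (a minimal sketch; the 16384-byte cap is an arbitrary example value):

```lsl
// A tiny Mono script voluntarily capping itself below the 64k default.
default
{
    state_entry()
    {
        // Request a 16 KB cap (Mono only); returns TRUE if it was applied.
        if (llSetMemoryLimit(16384))
            llOwnerSay("Cap is now " + (string)llGetMemoryLimit() + " bytes");
        else
            llOwnerSay("Could not set the requested memory limit");
    }
}
```

A per-script drop-down at compile time would surface the same capability without requiring scripters to know this function exists.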

Original Jira Fields
Field Value
Issue BUG-134167
Summary Script Memory Cap Increase as a Premium Feature? - Thoughts from the Aug 22, 2017 Server User Group
Type New Feature Request
Priority Unset
Status Closed
Resolution Unactionable
Reporter Jasdac Stockholm (jasdac.stockholm)
Created at 2017-08-22T20:26:20Z
Updated at 2017-09-06T18:06:10Z
Business Unit Platform
Date of First Response 2017-08-23T15:03:47.118-0500
ReOpened Count 0
Severity Unset
Target Viewer Version viewer-development

Chaser Zaks commented at 2017-08-23T20:03:47Z

Issue: Beginner scripters setting memory cap to max, even if they don't need it.
Suggested Fix: Have the options in the drop down default to, and show "recommended" 64k. If changed to anything higher, add a small notice akin to "This is a high memory allocation. If you're not sure what this is, leave it at 64k (default)". Or do like you did for mesh import, having to complete a short questionnaire before utilizing the feature. There were complaints about nobody using llSetMemoryLimit, but most people don't know how that feature works or what it does. Putting a memory limit option directly in the compile window with a hover tooltip would help.
Hide it as an advanced option. :P
EG: Edit -> "Enable advanced features"
With a warning dialog like "CAUTION: Enabling advanced features may degrade the performance of your script and the sim. These are intended for advanced scripters who need additional functionality. If you are new to scripting, it is highly advisable that you keep this disabled for now. Click [here|insert link to wiki page here] to read more about this."

Only people who know what they are doing would know how to enable this.

Issue: What about limiting this to experiences?
Solution: Please no, if these improvements help reduce load on the region, it would be to everyone's benefit if projects could utilize this anywhere in the world.
Although I am against requiring a premium membership to write better-performing scripts, it would be better than nothing.
An alternative solution would be something to limit an agent's script memory allocation. EG:

  • Estate owners/managers have unlimited script memory allocation.
  • Estates can set a limit on max allocated script memory for an agent, max allocated script memory per avatar for rezzed objects (not group owned), and max allocated memory for parcels, with 0 meaning no limit (the default, so as not to break existing content).
  • Groups can set "Unrestricted script memory allocation" for officers or owners.

If they go above this, one of two things can happen:
A) DEBUG_CHANNEL "Cannot start script [script_name] because it would go over the agent's allowed script memory usage"
B) Object would fail to rez with a similar message to the parcel full message.

Issue: A higher memory cap would slow region crossings.
Solution: Oz mentioned this. Currently scripters go around the memory limit by splitting their project into multiple scripts. Would a 256k script cause more region crossing delay than 4x 64k scripts? And if so, by how much?
I would think the contrary.
From my understanding, there is one HTTP request for every script an avatar has during agent transfer. So if someone splits a project into multiple scripts for additional memory space, that means there are going to be more HTTP requests.
It is much faster to send a single 256k block of script memory over a single HTTP request than to make 4 requests for 64k blocks of script memory.


Kyle Linden commented at 2017-09-06T18:06:11Z

Hi Jasdac,

Thank you for your suggestion. The team has reviewed your request and determined that it is not something we can tackle at this time.

Please be assured that we truly appreciate the time you invested in creating this feature request, and have given it thoughtful consideration among our review team.

This wiki outlines some of the reasoning we use to determine which requests we can, or can't, take on: http://wiki.secondlife.com/wiki/Feature_Requests

Thanks again for your interest in improving Second Life.
