SpenceKonde/DxCore

Question about changes to core files (Information Request)

mikrocoder opened this issue · 31 comments

Hello,

I have to ask you about your
\packages\DxCore\hardware\megaavr\1.5.8\cores\dxcore
files. You have made changes to new, new.cpp and new.h. Did you make changes to any other files?
Was that because of the "unused warnings", or were there other reasons?
This is compared to the original Arduino core files or the MegaCoreX files.
The reason for my question: I want to get LibStdCpp running with your DxCore.
I think I need to undo some changes.
https://github.com/modm-io/avr-libstdcpp
With the original Arduino core (Mega2560) and MegaCoreX (Nano Every), the first attempts work.

Made changes to? Since when? I believe every single file in the core has been modified from the stock version and from MegaCoreX. mTC and DxC however use nearly the same code, to the extent practical.

Most of those changes went in with 837df06 - they were made in direct response to a user complaint about missing new methods when compiling against newer versions of the C++ standard; there were never any warnings generated from that file.

Oh, and 1.5.8 is yesterday's breakfast; we're on 1.5.10 now (1.5.9 was no good). 1.5.10 is largely a bugfix release, but some of the bugs are not trivial ones, so 1.5.10 is recommended.

Hello,

I didn't mean the DxCore files you need for your controllers. I meant the avr-gcc files or the avr-gcc toolchains. "new" and all others belong to the toolchain. They have nothing to do directly with your core package.

Yes I know that you and MCUdude had to make certain changes to the toolchain to support your controllers. But also these are all changes that basically have nothing to do with the (Arduino) standard libraries.

I can't say exactly what the problem is yet. I currently only know that I have problems with the DxCore package and not with all others.

An example. I include
#include <array>
and create
std::array<int, 3> ar1 {33, 21, 17};
and output it. That works.
Then I would like to reverse the order with
std::reverse(ar1.begin(), ar1.end());
and I get the following message:
error: 'reverse' is not a member of 'std'

If I include <vector> without using it, I get the following messages:
error: 'nothrow' is not a member of 'std'.
note: 'std::nothrow' is defined in header '<new>'; did you forget to '#include <new>'?
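
For reference, a minimal reconstruction of the failing test (a sketch only; the comments quote the compiler messages I get, and whether anything else is involved I can't yet say):

#include <array>
// #include <vector>   // even unused, this include alone gives:
                        //   error: 'nothrow' is not a member of 'std'

std::array<int, 3> ar1 {33, 21, 17};

void setup() {
  // Printing the elements works; reversing them does not:
  std::reverse(ar1.begin(), ar1.end());
  //   error: 'reverse' is not a member of 'std'
}

void loop() {}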

This can currently be due to everything and nothing.

Why this doesn't work with the DxCore package, I don't know. I am still thinking about it. That's why I wanted to know what other changes you made to the toolchain files. Changing back to the MCUdude version of the three files new, new.cpp and new.h didn't bring any improvement. There must be more. But I can't expect you to tear everything apart now.

DxCore version 1.5.10 is installed.

"new" is part of the DxCore files too....Look in the dxcore folder.

Hello,

if I replace, in the dxcore path
\packages\DxCore\hardware\megaavr\1.5.8\cores\dxcore
the 3 files new, new.h and new.cpp with the ones from MegaCoreX, std::reverse works.
But including <vector>, for example, still does not work.
My toolchain is avr-gcc 13.2.0.

\packages\DxCore\hardware\megaavr\1.5.8\cores\dxcore

Again, why are you trying to use 1.5.8? It's buggy (just as previous releases were, especially with Optiboot) and has been superseded by 1.5.10. As a sanity check, update to 1.5.10 and then re-run your tests to make sure it wasn't something that got fixed along with the slew of bugfixes over the past several months.

Back more on the topic of New...

When New was last amended, it was part of the changes to enable, uh, was it -std=gnu++17? I think so.

I can't imagine why you'd get different behavior on my core with his version of new and on his core with his version of new - are you saying that's what you're seeing?

I have always been unclear on why new had to be provided by the core and wasn't part of the toolchain package....

I had written that I have 1.5.10 installed. The path with "1.5.8" was only copy/paste. Version 1.5.10 has not changed anything in the avr-gcc toolchain, therefore nothing changes with it.

Actually, what I need is a DxCore package with an unchanged Arduino file structure / substructure. That is surely too much to ask ;-) One where only the files that are necessary for the actual DxCore are added or changed, as is the case in the MegaCoreX package, for example. I currently suspect that somewhere path details were changed which altered the original structure of the toolchain - the structure that currently works.

Why avr-gcc 13? Because I want to learn about C++20 and C++23 features. Otherwise it makes no sense. Otherwise LibStdCpp wouldn't make sense either.

Actually, what I need is a DxCore package with an unchanged Arduino file structure / substructure. That is surely too much to ask ;-) One where only the files that are necessary for the actual DxCore are added or changed, as is the case in the MegaCoreX package, for example. I currently suspect that somewhere path details were changed which altered the original structure of the toolchain.

I have NO CLUE AT ALL what you want here. What does that mean? What "unchanged arduino file structure"?

There is no part of "Arduino" that is included with my core, nor with the download from the repo (see the manual installation notes for the extra steps needed: https://github.com/SpenceKonde/DxCore/blob/master/Installation.md ).

If you take a plot of land and build a house on it that presents a similar API to your old house - it has doors with knobs to open them, windows that slide up, and a kitchen to make food in. The windows and doors are in different places, the house is a different shape, and the appliances in the kitchen are totally different.... Then the building inspector shows up and demands to be shown the unmodified parts of your old house in your new one. You are baffled.

And that's how I feel on this question.

What kind of conversation is this?
You must also try to understand me.
You did not write the files new, new.cpp and new.h yourself. You took the "original" files from Arduino and changed them. You confirmed this yourself above. That's what I mean by the original files - the ones from before your later changes for someone. The question was whether you still have the files from before those changes.

Your explanations for the avr-gcc versions are not correct either. You have an aversion to new versions for some reason. I don't care. But that doesn't mean you have to badmouth new avr-gcc versions. If you want to use the optimal compiler for AVR, you have to use avr-gcc 8.

WestfW commented

I didn't mean the DxCore files you need for your controllers. I meant the avr-gcc files or the avr-gcc toolchains. "new" and all others belong to the toolchain. They have nothing to do directly with your core package.

I don't understand, either. avr-gcc does not include a C++ library or include files, and "new" in the Arduino core is provided by the Arduino core and not by avr-gcc. It's pretty much impossible to separate the cores/* files into "files needed for particular controllers" and "core compiler functionality" (although I guess "arduino-api" tries to do some of that, for newer chips.) new in the current ArduinoCore-avr was added, swapped, and fiddled with after the "branch" that led to DxCore (via megaCoreX?), so it's not so much explicit changes to the "originals" as separate "drift". (Heh. In DxCore, "new" just does a #include <new.h>, but in the Arduino core, new.h does #include <new>. MegaCoreX doesn't have new (the file) yet.)

It looks to me that the libstdcpp library you are trying to use will therefore have conflicting new/new.h/new.cpp implementations. I don't know whether it does special things to include its own versions of new, or whether it's targeted to older cores, or what. In general, it looks "less supported" than either Arduino or Spence's cores.

You have an aversion to new versions for some reason.

Just because they're not supported or provided by the chip vendor, or Arduino, or any other vendor with "skin in the game", and because discussion on the official forum (avr-freaks) says that they produce worse code.

If you want to use the optimal compiler for AVR, you have to use avr-gcc 8.

Citation? Is there some particular change you were looking for? I know there was some increased optimization of ISR prefix code that some people were excited about, but other weaknesses seem to have prevented wide deployment.
(Oh. I see. The libstdcpp library you're trying to use requires a newer compiler. But none of Atmel, Microchip, Arduino, MegaCoreX, or DxCore supports the gcc versions required by the cpp library.)

Pretty much what he said - though I have a somewhat different opinion on the division between core and toolchain files.

IMO there is a very clear boundary. The files in the toolchain package (currently Azduino7b1 on the latest release) belong to the toolchain; new is not one of them. The files that belong to the hardware package are the ones that are in this repo. I never modify toolchain files in ways specific to the core. Other than updating the headers, precompiled libraries and device specs to the latest versions derived from the ATPACKs, only 2 additional files are changed, eeprom.h and power.h; those two had gotten overwritten with an older version during one of the toolchain updates. I believe the changes were actually initially from Microchip and just don't get pulled in with the ATpacks - I think they're in the newer version of avr-libc, which Microchip has thus far managed to skate by without releasing source code for (so we can't build it for Arduino, since you can't distribute any toolchain unless you can distribute it for all platforms).

New was changed in several steps from the version we had when mTC and DxC were forked. I think the most recent change was because someone was pushing for C++17 support, and that meant making that work.

The degradation in quantitative measures of binary quality is definitely noticeable. The thing that makes 7 desirable is that it's the first usable major version (something went badly wrong when Arduino tried to use 6.x), and the reason they moved to it was LTO, which as I understand it arrived for gcc in general at the same time (i.e., it wasn't AVR-specific), and - unlike the other optimizations that get added - this one makes a huge stonking difference. But that's the only significant improvement that's been made as far as code quality for AVR; otherwise it's been slowly degrading since 4.3. And of course you would expect this, because the last version that clearly had the benefit of someone being paid to maintain it appears to have been - you guessed it - 4.3.

If Microchip is still paying someone a proper salary for that sort of work on gcc, I don't think they're getting their money's worth. But I suspect they're only paying their compiler people to work on their proprietary compiler. (I'm of the opinion that hiding the source for your compiler just makes you look suspicious, engenders ill will, reduces security (as it can't be audited), and I really wonder why the hell they insist on doing it. Does it really bring in that much revenue compared to what they make selling silicon? I think they'd do better by letting their best compiler loose for all - what, "Well, I wasn't planning to invest a bunch of money copying Microchip's design, but now that the source is available, I think I will buy that semiconductor design company after all"? Releasing their best compiler would mean more people using their tools, which one assumes produce better code (and I suspect it actually does, because if it didn't, they would have to pay customers to use their compiler, but they've got it going the other way), and since it only works for their parts, it would make their hardware that much more desirable....)

Is it really so hard to read the few posts? I have written that I have newer toolchains. These work together with LibStdCpp and other cores. I want to find out why this does not compile with the DxCore.

SpenceKonde unfortunately judges compiler quality only on code size - unfortunately not on bug fixes. But bug fixing is more important than code size.

You are welcome to try it out. avr-gcc 8.5

I'm not going to argue against your other opinions. That is much too tedious for me, together with all the back and forth over the texts. Because that also has nothing, absolutely nothing, to do with my question. The thread has totally drifted. If it works with other cores and not with the DxCore, there must be something different about the DxCore. That's what my question is about. Can we stay on topic?

WestfW commented

If it works with other cores and not with the DxCore, there must be something different about the DxCore

Perhaps the newer toolchain has bugs specific to support of the new AVRs. PROGMEM support (including whatever is used internally by g++) may be different with the possibility of ram-mapped flash, for example.

Now it gets weird. Is this conjecture, or is this knowledge? Also, this is not your problem, and that's not my question.
Can anyone add anything to my actual question?

WestfW commented

How about

  1. Arduino does not support avr-gcc beyond v7.3
  2. Arduino does not support LibStdCpp. Not someone's special version of LibStdCpp, either.
  3. https://github.com/modm-io/avr-libstdcpp does not claim to explicitly support Arduino.
  4. For AVR Arduino, "new" and "delete" are part of the "Arduino core", NOT provided by the compiler infrastructure.
  5. Almost all of the "Arduino Core" has been modified to support the Dx AVRs, so it is difficult and/or meaningless to try to provide a list of changes made in that part of the software.
  6. The only changes to the toolchain are to provide include files and libraries for the Dx chips. AFAIK, that's still standard avr-libc stuff, and the differences from the Microchip-provided toolchain consist of avoiding the need for "Packs."

It's vaguely unfortunate that DxCore isn't a "fork" of ArduinoCore-AVR (perhaps via MegaCoreX), so that diffs/changes can be easily tracked all the way back to a common point. Or "synced" to (maybe) pickup Arduino bugfixes. But forks in github have other issues...

I guess I'm sorry that your unsupported library for an unsupported feature set that works on an unsupported compiler does not work in DxCore. But "additional c++ features" is not a goal of the DxCore project, and poor Spence has plenty to do just chasing the churn in Microchip's chip output.
If you can narrow it down more than "maybe a change you made caused this to break", perhaps "we" can be more helpful.
Have you checked with the modm-io/avr-libstdcpp people? Do your examples fail in the same way with MegaCoreX?
the "new" implementations seem to have drifted apart pretty significantly, but I dunno if they're related to the problems you're seeing, or whether you just picked that as an example of code that has changed.

WestfW commented

Ok, I'm vaguely curious.
What steps do you use to replace gcc and install libstdcpp for Arduino? Despite some superficial resemblance (presence of src and examples sub-directories), it doesn't seem to be a "genuine" "Arduino library", and the whole "many include files" seems incompatible with it working as one in general. So I guess it gets installed in the avr-gcc tree?

I will note that the problem is not avr-gcc 8. I know of people who run on avr-gcc 10, 11, 12. The code they get out isn't any better, sometimes it's worse, but it is expected to work, though not "supported". The problem is that the libstdcpp you're trying to use isn't getting along with something, and you have reason to believe it's connected to the news. And indeed, the news did change several times post-fork. You may actually find that you need the history for those files on both repos to find the issues that started it all, because when something was reported in one repo, it was always ported to the other. You can see the history of that file on github. That's all I know about it. I'd need to refer to that, and I'm not going to read it to you. There was almost nothing to any of the new* files when the core started. It was only when people started wanting to bump the C++ std version up to gnu++17. I don't have a crisp understanding of what new contains, or why new isn't part of the toolchain and instead belongs to libraries of various descriptions.
I mean, it is interesting that it does work on other cores and not this one, and if you end up figuring out what exactly the issue is, that might warrant investigation. I lack the C++ background to be able to say with confidence whether that issue is expected behavior because you're doing something we expect will fail. You know far more about C++ than I do, clearly, and you seem to have narrowed it down to 3 files; the entirety of my knowledge about all three of them can be found in the github history of those files and the associated issues.

Hi,

@ WestfW
with MegaCoreX and others it works. I had already written that several times ;-)

The people from modm-io/avr-libstdcpp don't care about Arduino, because Arduino is not "the standard". But that doesn't mean it doesn't work - as I found with other cores. The adaptations of the Arduino IDE for the DxCore should have nothing to do with any toolchain files regarding the libs. The uC support of the actual compiler has nothing to do with any additional files like 'new' or others; the changes for uC support come before that. That's why I'm even more surprised about the problem. Adding #include entries in certain files of the LibStdCpp does not solve the problem. If I get more insight and get stuck, I will ask the modm-io/avr-libstdcpp people. But without further knowledge about it, that doesn't make sense yet, and it all takes a lot of time.

How I do it ...

Unpack a toolchain.
Unpack LibStdCpp and move some files.
customized package avrLibStdCpp_230924.zip
Create and modify platform.local.txt.
Example:

#compiler.path=C:\avrToolchain\avr-gcc-8.5.0_mingw32_binutils2.40/bin/
#compiler.cpp.flags=-c -g -Os {compiler.warning_flags} -std=gnu++17 -fno-exceptions -fpermissive -ffunction-sections -fdata-sections -fno-threadsafe-statics -Wno-error=narrowing -MMD -flto
compiler.path=C:\avrToolchain\avr-gcc-13.2.0_mingw32_binutils2.41/bin/
compiler.cpp.flags=-c -g -Os {compiler.warning_flags} -std=gnu++20 -fno-exceptions -fpermissive -ffunction-sections -fdata-sections -fno-threadsafe-statics -Wno-error=narrowing -MMD -flto -Wno-volatile -Wl,-u,vfprintf -lprintf_flt -lm -fwrapv -isystem "C:\avrToolchain\avrLibStdCpp_230924\include"
compiler.cpp.extra_flags=-fno-sized-deallocation -Wall -Wextra

With Microchip Studio similar method modm-io/avr-libstdcpp#17 (comment)

@ SpenceKonde
The more you think about it and the more is written about it, the less I understand why changes were made to the 'news' at all. The change of the compiler option from -std=gnu++11 to -std=gnu++17 has nothing to do with the 'news'. It is therefore currently completely unclear to me what had to be changed then, and why. I will try to trace the changes backwards. A test transplant of the files from MegaCoreX did not lead to a solution.

@ all
However, you should put aside your aversion to new things. All I ever read from you is "that's not possible, that's not possible, that's not possible", even though I am already much further along and everything works - except for the LibStdCpp. Therefore, you should react more open-mindedly and not always say no immediately.
Eventually, a solution will be found. I am sure of that. It takes time ...
Thank you up to here.

Uh, when we put it into C++17 mode, some code would no longer compile because it complained about missing new or delete methods. It may have been part of the group of related changes that came with C++... 14, according to the internet? "Sized deallocation" - I think that was what forced the news to be changed. New forms of new and delete appeared then, and I think reinforcements came in C++17, and... it was unpredictable when it would or wouldn't manifest, because the compiler may or may not choose to compile anything into code that uses the method in question, or something dumb like that. Since you have a copy of working and non-working news, just open up a merge tool and copy differences over, one function at a time, until you break MCX or fix DxC, and then you'll know where in them the problem is.
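
For orientation, these are roughly the operator forms involved - a reference sketch of the signatures the newer standards added, not the actual contents of the core's new.h (which may differ in detail):

#include <stddef.h>                                      // size_t

namespace std { enum class align_val_t : size_t {}; }    // normally declared by the core's new header

// C++98/11 forms - what the original Arduino new.cpp provided.
void* operator new(size_t size);
void  operator delete(void* ptr) noexcept;

// C++14 "sized deallocation" - the compiler may emit calls to these
// instead of the unsized forms when it knows the object size.
void  operator delete(void* ptr, size_t size) noexcept;
void  operator delete[](void* ptr, size_t size) noexcept;

// C++17 overaligned allocation/deallocation - used for types whose
// alignment exceeds the default new alignment (plus array/nothrow forms).
void* operator new(size_t size, std::align_val_t align);
void  operator delete(void* ptr, std::align_val_t align) noexcept;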

Like I said, there are issues associated with some of the commits on those new files (though they could be in either DxCore or megaTinyCore - stuff like this that is needed in both places is usually kept identical on the two cores so I can keep them in sync). Those issues have all the information I could come up with, in the person's original words. At one point, he scared up a link that would probably explain everything to someone who knew C++...

WestfW commented

All I ever read from you is that's not possible

Would you believe "that's not possible for us to do, because we don't have sufficiently deep C++ knowledge to debug a complex bit of code like libstdcpp." Speaking for myself (primarily a C and ASM programmer), the complex template coding in those .hh files is essentially line noise :-( All of that STL stuff looks like it's written in an entirely different language.
It's not clear that anyone officially associated with Arduino or any of the third-party cores has that depth of C++ experience.

WestfW commented

BTW, I get the same 'reverse' is not a member of 'std' error trying to compile a simple reverse example using Arduino for one of the ARM chips (which theoretically already have a libstdcpp), or using the GNU Arm Embedded Toolchain 10.3-2021.07 (from ARM) with up through -std=c++17, so maybe this is a bug deeper in the gnu toolchain? (It does compile natively on my x86 Mac, though - but that's llvm, rather than actual gcc...)

#include <array>
#include <iostream>

std::array<int, 3> ar1 {33, 21, 17};
int main() {
  std::cout << ar1[0] << ar1[1]<< ar1[2];
  std::reverse(ar1.begin(), ar1.end());
  std::cout << ar1[0] << ar1[1]<< ar1[2];
}
/usr/local/gcc-arm-none-eabi-10.3-2021.07/bin/arm-none-eabi-g++ cpp.cpp -std=c++17 -Os -g
cpp.cpp: In function 'int main()':
cpp.cpp:7:8: error: 'reverse' is not a member of 'std'
    7 |   std::reverse(ar1.begin(), ar1.end());
      |        ^~~~~~~

@ WestfW
It's less about understanding. You don't necessarily have to understand it. It contains almost only templates. You only have to be able to use it, just as you use for (...) {...} or reverse without ever having read the code behind it. LibStdCpp is there to be used.

I don't know what the constant negative statements are supposed to achieve. Why do C and ASM programmers always talk down the advantages of C++? What is the point? I'll let you program C and ASM, and you let me and others program C++, please.

Your program is still missing <algorithm>.

#include <iostream>
#include <array>
#include <algorithm>

std::array<int, 3> ar1 {33, 21, 17};

int main() {
  for (auto &d : ar1) {
    std::cout << d << ("  ");
  }
  std::cout << std::endl;

  std::reverse(ar1.begin(), ar1.end());

  for (auto &d : ar1) {
    std::cout << d << ("  ");
  }
  std::cout << std::endl;
}

If I transfer the example to the controller, I get the said error. Without <algorithm> (and without reverse) it works. A bare std::array works.

@ SpenceKonde
It will be a lot of work to look at the files. MegaCoreX and DxCore are not as synchronized as one might assume.

I don't know anything about MCX's 'new', but I suspect it is closer to the wild form. This core is NOT a direct descendant of MCX. This core is a descendant of megaTinyCore. megaTinyCore is a descendant of the stock Arduino 4809 core, not MCX, though we took a lot of features from MCX early on.

The version of new that MCX has (both files) can be freely used to replace the version that DxCore ships with, and at one time that's all we had. But there was concern that, because certain operator variants were possible in theory (the standard said they were), we might need to implement them, so we put in stubs that fail with an unambiguous error. If that error was ever found to occur, then we'd need to define them - but it doesn't appear to, meaning those functions are never reachable and are elided by the compiler. The new.h and new.cpp from MCX can be transplanted to DxC, and I predict that doing so will result in no change in observed behavior (it produces identical binaries in my tests). If it does result in things working, well, then please let me know what the correct way to handle this is? You freely admit that you know C++ better than I do, so this is your department, not mine.

The evolution path of the cores as is relevant to new is as follows:

MegaCoreX was derived from Arduino.

megaTinyCore was, separately, derived from the Arduino core, with some influence from MCX.

Subsequently, particularly early in the mTC history, there was limited exchange of changes between it and MCX.

Automatic linting was introduced, and initially a style checker was run to automatically astyle everything. Then, after releasing and doing further development, I looked back at basic files and wondered why I couldn't follow them, realizing - only once I fixed the indentation and the tests failed - that it was astyle that had trashed the formatting of almost every file. (astyle was only allowed to change files that once; since then it is only used to check, and it is applied only to libraries, bootloaders, and variants, not the core files, which get transmuted into a pile of brown sludge by astyle.) So shortly after an update which turned a large number of files into a non-human-readable mess of nested conditionals without differing indentation came an update that reinstated that indentation, all done by hand.

Spellchecking with codespell was also added as a linter. Codespell has steadily improved over the years; this means that there are several changes which change only spelling in comments, made when a word that was not previously recognized as misspelled became recognized. (codespell, in contrast to normal spellcheckers, does not check whether words are known to be correctly spelled, but rather whether they are known to be misspelled, and its library of detectable misspellings has grown significantly over time, since it needs to be run on code and not trip on variable/function names and the like.) Hence, from here on up, there may be changes in any file that consist only of spelling changes, made when the linter was updated and previously-passing files suddenly failed the spelling check as more misspelled words were caught (many, many more remain).

Then DxC was forked from megaTinyCore. Variants were derived from MCX variants, and then extrapolated to new pincounts.

time passed

Changes were made to the new files in response to an issue on one of the two cores. This issue requested that the standard be increased to C++17 (we later changed to gnu++17 to keep compatibility), and the changes were required in new for reasons discussed in the issue.

The updated file was ported to the other core.

At least once more, another issue led to changes - these added several cases that were missing and required (with a lengthy discussion in the issue containing a lot of information). Additionally, there is the addition of the new without the .h, which was also a response to an issue, but it may have been at the same time as I was working on this, so it may have gone in with the same commit.

Finally there were two changes that aren't relevant here, concerning stub functions, which I think are the most recent ones - there are several versions of new and delete added in C++14, and more in 17. After the initial fixes went in, someone tripped over a missing one, and I added that in. But at that time (as discussed in that issue), on the same list those appeared on were several other variations of new and delete to support overaligned allocation and deallocation. After initial attempts produced candidates for 2 of them, it was realized that we couldn't actually test the code, as we'd never generated code that needed them. So, not knowing whether the code could ever be called or whether the candidates did what they were supposed to, it was decided that rather than blindly use the untested implementations, we should generate an error with badAlloc() (which we defined in that file so it had no dependency on the core). The proposed implementations were included in the file, commented out. Much later, I discovered - while trying to compile to a .S so I could see what the compiler was doing to mangle my inline asm - that LTO had to be disabled to make that work, and that disabling LTO broke lots of stuff I had set up to detect compile-time-known-invalid values passed to functions (and calls that are valid sometimes but not with this configuration) and give errors stating what's wrong rather than compiling and generating wrong behavior, or generating a cryptic generic error message. So I put in #ifdefs all over the place to clobber everything that didn't agree with disabled LTO (and the core was not expected to produce working code in this case - LTO_DISABLED is "guaranteed not to work"; it was only there because occasionally I need to make the core compile as far as the .S with LTO disabled).
tldr of this: The stuff in new.h with the badAlloc error and the LTO_DISABLED #ifdef can be assumed to never be called in practice, and should be getting elided by the optimizer anyway as long as LTO is enabled. LTO is never disabled and LTO_DISABLED is never defined.
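
To make that concrete, here is one plausible shape for such a trap, using GCC's error attribute so that an unreachable call is simply elided under LTO while a reachable one stops the build with a readable message - a sketch only; the actual new.cpp may do it differently, and badAlloc here is only named after the description above:

#include <stddef.h>

namespace std { enum class align_val_t : size_t {}; }    // stand-in; the core's new header declares this

// If the optimizer proves the call unreachable, it vanishes; if user code
// ever actually needs this operator, compilation fails with this message
// instead of silently handing back a pointer with the wrong alignment.
__attribute__((error("overaligned operator new is not implemented by this core; please report this")))
void badAlloc();

void* operator new(size_t size, std::align_val_t align) {
  (void)size;
  (void)align;
  badAlloc();
  return nullptr;   // never reached; keeps the definition well-formed
}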

Since this time, DxC and mTC have been maintained in close synchrony (thus why you see no difference between them). A change that goes into one and applies to the other is typically ported within a patch version or two.

Thus: all changes starting from when megaTinyCore was forked from the Arduino core are listed in the megaTinyCore new history. The files are identical on the two cores and mirror each other closely (the news are identical; actually, so is the api directory, though it is not unmodified from stock - the stock api directory is an appalling pile of toxic rubbish, negligent bloat, and ideas that might have been okay if they had been done that way from the start). The first commit is what I started with, and I think it would have just been the stock core's version.

There is really no non-trivial code in those files whatsoever. All we ever do is cast arguments to void and call some builtin.

The only code more complex than that is the commented-out proposed implementations for the ones that currently give the badAlloc() error, which nobody has ever reported receiving, and which I don't think actually occur in the real world. (The badAlloc errors are mainly there to determine whether this was something like the deletes they added in C++14, where code that compiled just fine in C++11 - where there was no such thing - would not compile for C++14 because the new variant of delete wasn't defined. This is sneaky, as most code would compile fine, and only a small portion of sketches would trip over something that C++14 mode builds to use the new delete, doesn't find it, and fails to compile. By clarifying the nature of the error in the event that an analogous thing occurred with C++17 and its new new/deletes, it was hoped that these distinct errors would give more insight to the developer who encountered them.) As it has turned out, nobody actually uses those new/delete operator variants, so the errors they can display are never generated, which seems to indicate I made the right decision not to support them.
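
A sketch of that failure mode, to make it concrete (illustrative only, not taken from an actual report):

struct Widget { int x; };

void demo() {
  Widget* w = new Widget{42};
  delete w;   // With sized deallocation in effect (C++14 and later, e.g. via
              // -fsized-deallocation), this may lower to a call to
              // operator delete(void*, size_t). If the core only defines
              // operator delete(void*), the build fails - but only for the
              // minority of sketches that ever take this path, which is
              // what made the problem look so sporadic.
}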

This is everything I can piece together that I know - you now know as much as or more than I do about the matter of the news on mTC and DxC. There is, like, nothing in that file. Did you not notice that the only non-trivial code in new.cpp is a proposed implementation that is commented out?

I will note for the record that I am baffled as to why new has to be part of the core, and why there isn't a standard implementation for each architecture - certainly prior to the overaligned versions in C++17, the implementation is truly trivial... You'd have to ask a C++ expert about the reason for that - I'm sure there is one, though I'm less than confident that it's a good one.

No unused warnings were ever observed in the context of the new functions.
The only changes consist of comment spellchecking and format linting, and adding the "stubs" for the C++17 overaligned allocation and deallocation. I think I saw one proposed implementation, but it just cast the alignment to void and returned the result of a normal malloc call - so the pointer it returned would not be overaligned as requested. This seemed dangerous for any use case I could imagine that might think to request an overaligned allocation. The reasoning was thus straightforward: if a use case requires overalignment, either the overalignment is being requested unnecessarily, or it's not and the overalignment is relied upon by the program. In the former case, the program is requesting overalignment it doesn't need, which in turn interferes with efficient memory allocation, and the code should be corrected (but then why was the alignment requested? That's not something you add by accident). To my knowledge - based on the absence of complaints and intuitive reasoning, though we had initially feared otherwise - this doesn't appear to manifest in user code (and you're not reporting a badAlloc either, so you're not hitting the error trap).

If the program is not lying when it says it needs the pointer overaligned, we know that the new/delete operator it requested will be called. Since we can't give it an overaligned pointer, we could give it an under-aligned pointer - but unless the aligned allocation was requested unnecessarily, this constitutes generating a binary that we know will not work most of the time, which is a violation of the design goals of the core: error conditions that can be recognized at compile time should be, whenever possible, because we have no exceptions and poor programming practices are endemic. We could return a null pointer, but there's a distinction from normal calls that can return a null pointer - functions or operators that can legitimately return a null pointer are normal and fine; those that can only ever return a null pointer, because they are not implemented, should not pass through the compile process silently, right? (Remember, we can't generate warnings - the warning attribute does not appear to work - except through the preprocessor, and the preprocessor doesn't know which functions are reachable, so a #warning would be displayed on every compile, rather than just the compiles that actually manifested one of these cases where we don't have an implementation.) So I think this is also excluded by the "don't generate code you know won't work" rule. The third option, which I would think ideal, would be to have a working implementation, but we don't have one of those, nor am I confident enough in what correct behavior is that I would be comfortable implementing it without a reference for what it's supposed to do and a test case so I'd know if I got it right. So I lack both of the requirements, and "return an overaligned pointer as the caller requested" is off the table. Returning anything that isn't what was requested is clearly wrong, and we don't know how to do it correctly nor how to test that it has been done correctly if we tried, so I think the least-wrong thing to do here is to error out, which is thus what I do.
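
For illustration, this is the kind of "implementation" being rejected here - it accepts the alignment and then throws it away (a sketch only; the commented-out candidates in new.cpp may differ):

#include <stdlib.h>                // malloc
#include <stddef.h>                // size_t

namespace std { enum class align_val_t : size_t {}; }    // stand-in declaration

// DON'T DO THIS: the alignment argument is silently ignored, so a caller
// that genuinely needs (say) 16-byte alignment may get back a pointer that
// is only aligned to whatever malloc() happens to guarantee.
void* operator new(size_t size, std::align_val_t align) {
  (void)align;                     // "cast the alignment to void"...
  return malloc(size);             // ...and return a normal allocation
}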

Obviously, all these problems would vanish if I had a trustworthy implementation, or an implementation and a test case to show that I should trust it. Alternately, it is possible that the entire venture was misguided, and it may be that those operators shouldn't be defined by the core at all. I remember it being very hard to find information for someone in my position, having to create new.cpp without being an expert in C++. It seems like something awfully basic yet esoteric not to be part of the compiler package - why must we synthesize this?! There was no resource that says "these operators need to be defined in these cases, and here are the constraints they must abide by". If the C++ tradition is for one new.cpp file supplied by some architecture-specific package, and then a "standard library" that defines the C++17 ones by referring to the others, meaning these definitions should not be in the core - I would believe that. I don't know! Neither did the guy who opened the issue that led to that. The assumption seems to be that anyone in the position of having to create a new.cpp already knows these things, but I'm not a C++ person; I am completely out of my league. In fact, not just that - I don't like dynamic allocation at all in any embedded context; it's repulsive and inappropriate.

Of course, I use parts with less than 32k much more often than ones with more than 32k. One is always going to be more careful with flash size if one routinely works on 4k and 16k parts. And if one is maintaining a core, one certainly should be careful about flash. Before I developed these instincts, people complained about code size and bloat. But that also colors my opinion of things (like dynamic allocation, and a lot of C++ constructions). Things that result in larger binaries are bad; things that make assembler listings hard to read are bad, because that's what I look at when debugging most of the time.

Thanks for the detailed answer. I will have to read that several times. :-)

A short answer to the question about new and the core.

Normally everyone has LibStdCpp available. Only for AVR is this original missing. new and delete are nevertheless needed, so they were separated out and made available separately. Why the AVR group at gcc decided against LibStdCpp at that time, I don't know. Apparently they thought that nobody would use it on the small controllers. Back then perhaps understandable; today everyone is annoyed that LibStdCpp is missing. And now there are the "modm-io" people who have reimplemented it for AVR and make it available.

Everyone is? Nope. Not true. There are a lot of bad decisions made by the avr-gcc folks (though out of these three, they're the ones I can blame for the fewest things, as most deficiencies of the compiler are not poor decisions but attributable to a simple lack of resources - which Microchip, as the sole supplier of AVRs, really ought to be providing), by Arduino, and by Microchip.
Arduino has had 2.x for over a year now I think without releasing a usable build. Does that annoy me? HELL YES
Microchip renames bitfields and registers to change one abbreviation into another (FREQSEL turned into FRQSEL - I guess they were worried about running out of capital E's for the Ex-series?). That is intensely annoying.
Arduino's official core is horribad. That annoys me (using a 328p feels like I've gone back in time for two reasons). And they've never missed a chance to come up with a fever dream of a terrible pinmapping. That annoys me.
Microchip doesn't seem to do die revs, despite having products with in excess of 30 errata. That is infuriating!

LibStdCpp? I hadn't heard of it before this issue, so it's not annoying to me. At least it wasn't until someone created an issue about it. If "everyone except Spence" is annoyed by the lack of LibStdCpp, that's hard to reconcile with the fact that, out of the combined total of 2500 issues, this is the first one that asks about it.

Translate that as "many people." There is still a world outside your DxCore.

Of course. But we're speaking in the context of mTC and DxC - these are the DxCore/mTC issues! (As noted, they're mostly the same.) So the relevant population we need to be considering is markedly more constrained: only persons who are developing, planning to develop, or considering developing embedded software for modern AVRs using the Arduino IDE or compatible development tools are impacted. So while the scope is a little larger than DxCore, it is not dramatically so. Arguments from anyone outside of that group should be disregarded. Some guy who writes C++ for x64 all day but doesn't actually program embedded devices or plan to - and everyone else shouting criticism from the peanut gallery - should not be listened to, right? They don't understand the tradeoffs involved, and likely have no concept of how constraining the resource requirements of these parts are.

This would be the first issue where I was able to ascertain any specific motivation for considering the compiler version and expressing displeasure over the matter. We sometimes get people who've updated the toolchain to some newer version and are having some weird issue or another - but I think there are some people using much newer versions successfully. One striking thing is that none of them have mentioned much in the way of potentially compelling reasons for doing so (except for the guy who's been modifying his toolchain - he didn't need any help, though). By far the most common reason seems to be people who assume that a bigger version number means smaller, faster code. My understanding is that that was the case in the distant past, with avr-gcc 4 being a big improvement. But that trend came to a halt in 5.x, fumbled in 6, and then lurched forward once more by virtue of getting LTO from gcc in 7.x; then an arm reached out of the quagmire of low performance and grabbed its leg, it stumbled and collapsed, and with every version it's been pulled further back into the muck, at a seemingly accelerating rate. In the latest versions it appears that literally nobody is checking AVR-specific stuff. I hear the latest ones spam warnings over bog-standard constructions that are correct, recommended on AVR, and used by the manufacturer in the I/O headers (because they imply that 0 is a valid pointer to write to, which it is on AVR but almost nothing else). According to a thread on it, there is a simple option in the platform configuration that can be changed for AVR to get rid of it. The fact that this was not done for at least two versions, and may still not be, says something. Specifically, it says "ain't nobody maintaining the avr in avr-gcc!"

As an aside, the barrier to getting me to move to a newer version would be very high. Because, before we start talking about the merits of the change, I would need to know that it was not just an academic discussion about something we will never have. I would need either a set of toolchain packages for all the Arduino platforms (which could be dismembered like the arduino7 one to "update" the toolchain), or an AWS AMI image for a Debian Linux environment that I can use to build newer versions of the toolchain and cross-compile for Windows, x64, linux x86, arm32 and arm64, and mac x64. In the former case, I would still need to be able to build for Linux x64, since that's part of the process we use to generate the files we need in the toolchain - but that's a much lower bar than cross-compilation. This is apparently a major barrier - it is a dark art.

Because a change to the compiler version would likely have dependencies elsewhere in the core, and because - to have any confidence that a reported bug was or was not related to the compiler version, and to respond to issues effectively - we would need the same compiler version used by everyone (this would also allow more automation to be used in analyzing the impact on flash size, which would in any case need to be conducted), and we have people on all of the platforms.

So if you were entertaining any hope of compiler version updates - these are the two challenging prerequisites: we need to see that, on average, it produces code no worse than the current version

AND WE NEED TO BE ABLE TO GENERATE THE FILES.

Currently I cannot generate working binaries for any platform, not even the one I build on; that's why I butcher arduino7 to get binaries.

No further arguments in favor of this have appeared, and no indication of anyone knowing a way to make the binaries; without that, the whole discussion is never going to come to anything.