• @[email protected]
    link
    fedilink
    English
    941 year ago

    Wouldn’t most development programs tell them the syntax error on the line and column where you replaced the semicolon?

    This wouldn’t work

  • Brickardo · 71 · 1 year ago

    Why, of all possible languages, would you suggest this for JavaScript, where semicolons aren’t even mandatory?

    • @[email protected]
      link
      fedilink
      English
      221 year ago

      Not only that, the interpreter will point directly to the line of code and possibly to the exact character that is the problem. Any programmer worth anything would find the issue or, worst case, retype the line of code and have the problem fixed rather quickly. “Illegal character” is a pretty easy error to diagnose.

      But…I still chuckled a little at the intent of the joke. I’m sure there are better pranks one could come up with, though.

  • @[email protected]
    link
    fedilink
    English
    291 year ago

    Meanwhile in VS Code: hey, I see this Unicode symbol that’s confused for this expected symbol, would you like to replace it?

    • @[email protected]
      link
      fedilink
      21 year ago

      BOM U+FEFF is another fun one. Most editors won’t show it, but it can cause errors; I once found one in the middle of a SQL script that had been combined from existing UTF-8 files with cat. You’ll see it in a hex editor, and in Notepad I think it just made the rest of the line italic.

  • @[email protected]
    link
    fedilink
    201 year ago

    That’s too evil. At my work, people like to put a piece of tape on the bottom of someone’s mouse, and that can be pretty funny.

  • @[email protected]
    link
    fedilink
    191 year ago

    MSVC supports Unicode in source code. In C or C++, you could try:

    #define ; ;

    The second one is the Greek question mark, which looks just like a semicolon, but the client I’m using may strip it out. I’m too lazy to try.

    • PM_ME_VINTAGE_30S [he/him] · 3 · 1 year ago

      Running #define ; anything yields error: macro names must be identifiers for both C and C++ in an online compiler. So I don’t think the compiler will let you redefine the semicolon.

      • @[email protected]
        link
        fedilink
        31 year ago

        Haha. Thanks for checking. Given the C pre-processor, I’m sure there’s a way to maliciously bork it if someone sets their mind to it.

        • PM_ME_VINTAGE_30S [he/him] · 2 · edited 1 year ago

          Well, I just tried #define int void in C and C++ before a “hello world” program. C++ catches it because main() has to return an int, but C doesn’t care. I think that’s because C just treats main() as returning int by default; older books on C don’t even include the “int” in “int main()” because it’s not strictly necessary.

          #define int void replaces all ints with type void, which is typically used to write functions with no return value.

    • @[email protected]
      link
      fedilink
      11 year ago

      I’m not sure, but I think the second one looks just a tiny bit different, so it should have worked.

    • @[email protected]OP
      link
      fedilink
      5
      edit-2
      1 year ago

      What? Remapping your keyboard? Well, there are worse jokes. I still remember when the first PCs appeared and you could try them out in shopping centers. They quickly stopped doing that, because some people went into the BIOS and disabled the keyboard or, worse, the screen. Hail Satan. Good ol’ times 😏