Posted on 19-10-25, 09:58 in Making interactive fiction (revision 1)
Post: #101 of 203
Posted by Screwtape
Something that requires a PHP backend might make it easier for people to run on older, simpler devices without JS, but I feel like these days there are a lot more people without reliable network connectivity than without JS support. And I really like the idea of sticking a static HTML site on a webserver that people can save and run offline, or even making it a progressive web app so people can install it on their phones without me having to package it separately for each OS and device.


Yeah, that was the basic idea - a plain old HTML site. JavaScript would be required for holding state such as the inventory, and either a multi-page-in-single-doc layout or cookies for carrying said inventory between pages (basically, just a 64-bit or 128-bit number of flags and keys - see the sketch below). Good for PWAs as well.
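For what it's worth, here's a minimal sketch of that flags-and-keys number in plain JS. Every name in it is mine, not from any actual engine, and I'm using localStorage where a cookie would work just as well:

// Game state as a 64-bit flag field, persisted across pages.
const FLAGS = {
    RUSTY_KEY:       1n << 0n,
    CHAMBER_FLOODED: 1n << 1n,
    DOOR_OPEN:       1n << 2n,
};

function loadState() {
    return BigInt(localStorage.getItem('ifState') || '0');
}

function saveState(state) {
    localStorage.setItem('ifState', state.toString());
}

function has(state, flag) {
    return (state & flag) !== 0n;
}

// "if use key on door show open_door.png; else show closed_door.png"
// (the #door selector is just illustrative)
let state = loadState();
document.querySelector('#door img').src =
    has(state, FLAGS.RUSTY_KEY) ? 'open_door.png' : 'closed_door.png';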

Posted by sureanem
What do you need the JS for anyway? If you're making a point-and-click game, I think good ol' CSS can do anything. Failing that, there's always image maps.


Like I said - HTML and CSS can do a lot of things, but you cannot implement a key-and-locks system in them (if use key on door show open_door.png; else show closed_door.png), since you need some sort of variable storage. If you can, I dare you to do a CSS Zen Garden design for it. ;) A lock does not need a literal key, either; you could simply alter something in the environment (like flooding a chamber to raise a floating platform). The "key" then becomes the button that floods the chamber.

However, it is possible to make the JS completely transparent, so you only author HTML and CSS using certain key classes - something like the sketch below. It would be a bit limited for sure, but doable, and the JS would be simple checks. Like I said, I'm aiming for a more Myst / CoMI experience than Zork.
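A rough sketch of what "transparent" could look like, building on the state helpers in the previous sketch. Authors would tag their HTML with data attributes and never touch JS themselves; again, every attribute and flag name here is hypothetical:

// Authors write e.g.:
//   <a href="door_open.html" data-needs="RUSTY_KEY" data-sets="DOOR_OPEN">Open the door</a>
// One generic handler does the simple checks.
document.addEventListener('click', (event) => {
    const el = event.target.closest('[data-needs], [data-sets]');
    if (!el) return;

    let state = loadState();
    const needs = (el.dataset.needs || '').split(/\s+/).filter(Boolean);

    // Lock check: block navigation unless every required flag is set.
    // Unknown flag names simply fail the check.
    if (!needs.every((name) => has(state, FLAGS[name] ?? 0n))) {
        event.preventDefault();
        return;
    }

    // Passing through the "lock" may alter the world state.
    for (const name of (el.dataset.sets || '').split(/\s+/).filter(Boolean)) {
        state |= FLAGS[name] ?? 0n;
    }
    saveState(state);
});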
Posted on 19-10-25, 13:12 in Stupid computer bullcrap we put up with. (revision 1)
Post: #102 of 203
More bullcrap from a Windows environment:

Just installed Visual Studio 2019 professional edition.

It only ate 18 GB of my precious 120 GB boot drive (C++, .NET and Universal apps).

Meanwhile my Linux build environment sans the project build folder is a whopping 9 GB in total. That's the entire OS minus my home directory.

What the actual fricken heck is going on???
Posted on 19-10-31, 23:29 in Stupid computer bullcrap we put up with.
Post: #103 of 203
Posted by CaptainJistuce
Vim is antiquated garbage.


* wertigon looks at his emacs setup containing evil, magit and org-mode

... *sigh*

* wertigon feels old and antique
Posted on 19-11-01, 07:59 in Stupid computer bullcrap we put up with.
Post: #104 of 203
Posted by BearOso

The biggest thing I love with newer editors is Language Server Protocol support, or "Intellisense." Nothing saves time like having the API documentation pop right up without having to look it up, or having your compiler parse everything as you go and correct typos. It's even standardized and can be included in basically any editor you want.


This is very true, and I actually use it frequently in emacs. What keeps pulling me back to emacs boils down to two things:

* The best vim emulator I've encountered yet (evil) - it really, truly, actually works just like vim
* The customizability of the tool. Nothing even comes close.

I also didn't get the point of vi for the longest time. All these weird idiotic combos, two editing modes - I mean, what for??? And GAH, why did I press ctrl+s and why did vim suddenly freeze? WHAT GARBAGE IS THIS?????? >_<

Then I saw a video that explained it to me. I was approaching the whole thing with the wrong mindset. vim isn't so much an editor as an editing language. And once you make that connection, you start speaking it: dw to delete a word, y2w to copy (yank) the next two words, and so on. Combine this with paste buffers and macro recordings, and repetitive tasks become really easy to perform.

Now, combine the power of that with a fully customizable editor in which you can configure every single painstaking detail, and well... It is tempting, truly tempting. I feel more productive editing text. Not sure if I actually am, though, especially since the time I spend toying with the emacs config is probably greater than the time I'd lose using a less productive method to achieve the same tasks. And while emacs allows full customizability, it also forces you to customize, for better or worse. Modern emacs is a powerhouse, but only if you're willing to spend the time to make it one.

For the record, what I'm looking for in an editor:

* Responsiveness
* Customizability
* Productivity features

The only editor that ticks all three right now is emacs. GUI editors usually aren't responsive or customizable, but have awesome productivity features; emacs has clunkier productivity features bolted on as an afterthought, but is extremely responsive and customizable. On a scale of 1-15, Visual Studio / VS Code is something like an 11 and emacs is a 13 for me.
Posted on 19-11-07, 22:37 in Games You Played Today REVENGEANCE
Post: #105 of 203
Posted by Broseph
Been trying a few SMW hacks. Super Mario Bros. 1X is quite good. (Video)

It's a SMW hack but it's done in the style of All-Stars graphically and features levels and gameplay mechanics from a whole bunch of Mario games (including levels straight out of Super Mario Land and SMB2US). It starts out with a fairly normal degree of difficulty (about the same as the original games) but gets progressively harder and by W-6 or 7 it starts to be relatively challenging but never frustratingly so.


Wait... SMW hacks work on an snes emulator now?

(as opposed to a zsnes emulator)
Posted on 19-11-08, 11:37 in Games You Played Today REVENGEANCE
Post: #106 of 203
Posted by Broseph
Oh yeah, at least the good ones do. This hack works on bsnes, higan and Snes9x (and should work on hardware obviously). The "hack was only tested on ZSNES" era is gone for good it seems. I think smwcentral will indicate when a hack has issues on hardware(/good SNES emulators). Not sure if they even accept new submissions where it only works on ZSNES.


... Dammit, I missed the moment when hell froze over, then! :P Also:

Posted by Broseph
Terranigma


Aaaah, such a great game from my childhood. And the nightmares I had from the sexy looks thrown by those monsters... :)
Posted on 19-11-14, 10:40 in Stupid computer bullcrap we put up with.
Post: #107 of 203
Posted by kode54
This is an AMD Hackintosh


Posted by kode54
I can’t believe people put up with this shit.


People in general don't. They just buy Apple products that Just Work(tm). Very few actually bother with hackintoshes; I'd say fewer than 20,000 people do it, or something ridiculous like that (against an Apple base of over a billion users).

Also, the Just Works(tm) is more like Just Doesn't Work unless your entire hardware lineup is deliciously expensive Apple, but I digress.
Posted on 19-11-15, 14:33 in Games You Played Today REVENGEANCE
Post: #108 of 203
Posted by Kawa
I'm given to understand that this is a Switch problem, not a SwSh problem, related to poor ExFAT support on the system itself. This has reportedly also happened while playing Cadence of Hyrule, Breath of the Wild, and possibly others I can't recall, with the one common factor being ExFAT-formatted SD cards.


And that in turn is due to Microsoft refusing to release proper documentation for their file system. Wee! :D

Solution: let the Switch format your SD card instead. Better compatibility.
Post: #109 of 203
This is freakingly freaking awesome!

https://www.contrib.andrew.cmu.edu/~somlo/BTCP/

"My goal is to build a Free/OpenSource computer from the ground up, so I may completely trust that the entire hardware+software system's behavior is 100% attributable to its fully available HDL (Hardware Description Language) and Software sources.
More importantly, I need all the compilers and associated toolchains involved in building the overall system (from HDL and Software sources) to be Free/OpenSource, and to be themselves buildable and runnable on the computer system being described. In other words, I need a self-hosting Free/OpenSource hardware+software stack!"
Post: #110 of 203
Posted by sureanem
That's interesting, but how does he know the FPGA doesn't have any backdoors?


...

You really have no clue how an FPGA works, right? It's pretty much impossible to put a back door into the FPGA itself, since it has no persistent storage on the silicon. It does not even have an I/O die.

Granted, you could do the good old compiler payload trick (Thompson's "trusting trust" attack), but the FPGA in and of itself... Nope.
Post: #111 of 203
Posted by sureanem
How do you know that, though? It's an integrated circuit; it's not possible to audit it. They could burn literally anything they please into it and you'd have no way of telling, short of decapping it.


FTFA: "The FPGA is a regular grid of identical components, so (destructive) visual inspection (i.e., chemical ablation and TEM imaging) is more feasible than with a dedicated ASIC that has much less visual regularity and repeatability."

You can never be 100% safe. But you can be reasonably confident that FPGAs are tamper-proof.
Post: #112 of 203
Posted by sureanem
Yes, but it's hardly cheap to do so, and you could very well be prevented by legal limitations.


Decaps are actually fairly cheap... for corporations and countries, both of whom are interested in getting this kind of verifiability. For you and me, not so much. :)
Posted on 19-11-20, 14:06 in Linux + FPGA + RISC-V = self-hosting libre hw/sw stack (revision 1)
Post: #113 of 203
Well, it is possible to build an FPGA at home, thankfully. All the pieces are there, and sure, the prototype will be freakishly large, but you can use that FPGA to validate the input and output of the smaller one: http://blog.notdot.net/2012/10/Build-your-own-FPGA

Sure, it requires like ten breadboards and lots of cables and spare time. But if you are that paranoid, man gotta do what man gotta do! As for bootstrapping the FPGA, start with machine code and keep going from there. :)

Thankfully, since the industry is trending more and more towards Linux, support for that OS is increasingly common as well. See, for instance, Lattice: https://wiki.debian.org/FPGA/Lattice

And of course, it is already possible to print a CPU today; it will just take a lot of space: https://www.researchgate.net/publication/282544864_3D-printed_microelectronics_for_integrated_circuitry_and_passive_wireless_sensors
Post: #114 of 203
Posted by sureanem

I can't imagine it would be too hard [to introduce hardware backdoors].


You still do not understand what an FPGA is and how it works.

An FPGA consists of two things: transistors and memory (actually, registers). All of that memory is writable - every last ounce of it - and if it isn't, that is immediately detected. If there is anything other than transistors and writable memory in there, the pathways will be wired differently and, again, it will stick out like a sore thumb during factory inspection, and any post-factory tampering is pretty easy to spot.

Sure, you can do it. But it would be immediately noticed, and defeating computer security is all about being stealthy - otherwise the hole will be immediately patched. What you are arguing now is that US prisons aren't safe because an earthquake could tear the walls open, or a bulldozer or wrecking ball could knock them down. That does not make the prison insecure.
Posted on 19-11-22, 11:08 in Stupid computer bullcrap we put up with.
Post: #115 of 203
For OpenGL, these days, Mesa seems to be moving to OpenGL-over-Vulkan anyway (the Zink driver). So in the future the only necessary API to implement will be Vulkan - and that is much easier to implement than OpenGL. A win-win deal if you ask me! :)

Not to mention AMD will power the next-gen stationary consoles (since Nintendo has basically left dedicated stationary consoles behind), with 4k/8k/16k gaming and all that nice stuff. And yes, 160 Hz @ 16k is pretty much as good a resolution as you will ever get; anything beyond that is pointless for a TV or computer monitor. You'd have to project the image on a surface 9 by 16 meters and sit like 1 meter from that screen to even notice the pixels.
Posted on 19-12-10, 12:20 in New Realtek website... where are the audio drivers!?? (revision 3)
Post: #116 of 203
Would just like to chime in here - I recently upgraded my 2010-era computer to this on Black Friday:

Case - Kolink Satellite (~$25)
PSU - 400W Corsair (recycled, probably want to replace it soon)
Motherboard - Gigabyte B450I Aorus Pro WiFi (~$100)
CPU - AMD Ryzen 5 3400G (~$110)
RAM - 16GB Corsair Vengeance LPX 3200 MHz (~$75)
SSD - Kingston A2000 512 GB NVMe gen 3 (~$55)

Total price: ~$365
And a similar build on PCPartPicker: https://pcpartpicker.com/list/TMR7Rk

In that list, you want the better-rated PSU if you plan on getting a better GPU in the future, and the case is slightly more expensive - the rest is down to Black Friday deals sweetening the cost for me. :)

My system runs Linux without a hitch, has 4 cores / 8 threads, I can game perfectly fine on it, and it is in general an awesome budget build. The latest stock Ubuntu (19.10) worked right out of the box, no configuration necessary. It's not good enough for 1440p gaming, but 1080p runs pretty well and 720p is great, and I can always upgrade to an RX 5500 XT or 2060 Super later. Heck, the CPU should even be capable of feeding a 5700 XT in theory, though it might need an overclock. Simply put, this package is great value for the money right now. :)

[edit]Oh, and in regards to the 3600 vs the 3700X - how much do you need the two extra cores? If all you do is game, go with the 3600. If you stream or do other CPU-intensive stuff, go with the 3700X. There is absolutely no reason to go above the 3600 if you are building a Lintendo / Wintendo Playstation :)[/edit]
Post: #117 of 203
Posted by BearOso
Given his post history, I think he’d be happier with the 3700x. He’s going to want to have the extra single-thread performance.

I would have gone for a x470 motherboard because I know the stupid tiny chipset fans on the x570 would annoy me, and the b450’s choke buzzing would be more audible, too.

I’m looking at RAM prices, and 16gb 3600MHz is only about $5 more than 3200MHz right now, so the bump is worth it.


- RAM prices: I agree it's worth stretching for the 3600MHz RAM - if your motherboard can support it.

- Motherboard: the MSI B450 Tomahawk MAX has enough VRM performance to handle the 3700X for sure (though it may not be fit for overclocking). Feature comparison of the different chipsets:

* B450 - PCIe 3.0, USB 3.1, max RAM speed 3466 MHz
* X470 - PCIe 3.0, USB 3.1, max RAM speed 3466 MHz, dual GPU full lane support
* X570 - PCIe 4.0, USB 3.2, max RAM speed 4666 MHz, dual GPU full lane support

Add to that that the X570 is only slightly more expensive than the X470, and the X570 makes more sense of the two. Therefore it's between the B450 and the X570.

Now, if you only need a single expansion card, I'd go with the Asus X570 Crosshair VIII Impact, which lets you cram two PCIe 4.0 NVMe disks in there, and has the best VRM set on the market right now as well as a lot of other goodies. At $430 it's steep, but it will be well worth it, since this board can handle everything you will ever throw at it, including extreme overclocking, and still costs less than the full-blown Crosshair VIII Hero.

But if you do not plan on overclocking and trying your best to melt your CPU into a puddle of silicon and glorious metal, go with the MSI B450 Tomahawk MAX.

As for CPUs, the 3700X is not a bad CPU, but it will not do much more for you if your goal is to game; we're talking a 2%-3% performance increase for $200. If the main purpose is gaming plus streaming plus compiling stuff, then yes, the 3700X is awesome, and I wish I had one as a developer. :)

Posted by creaothceann
The ASRock X570 Aqua (90-MXBAZ0-A0UAYZ) doesn't have chipset fans


True, but it costs quite a bit more AND requires watercooling.
Posted on 20-01-15, 14:19 in Games You Played Today REVENGEANCE (revision 1)
Post: #118 of 203
Finally got a decent game for X-mas. Now playing through Fire Emblem: Three Houses on the Switch and having an absolute blast. Just finished the mission where the Death Knight makes his second appearance, but I totally butchered the pacing in that mission, not to mention I missed out on valuable training lessons (because no one tells you to wait 'til the end of the month... Bah, humbug!), so I'm probably redoing it tonight.

This game is ridiculously addictive, even for a FE title. :)
Posted on 20-01-23, 08:55 in How to phone app?
Post: #119 of 203
The conversion is actually super simple, with a clear and concise pattern: 1.4 * 1m = 1m 24s.

Since the function is linear, we can rewrite it like so: 1.4*x minutes = x minutes + 24*x seconds = (60 + 24)*x seconds = 84*x seconds

See the table below for what that looks like. One interesting point: every five minutes of input adds exactly two extra minutes to the output (5.0m becomes 7m 0s), and the seconds pattern repeats every five minutes - hence you only need to memorize the first five minutes' worth, which is pretty easy to keep track of. :)

This is the power of math, people!

 0.5m =  42s =  0m 42s
 1.0m =  84s =  1m 24s
 1.5m = 126s =  2m  6s
 2.0m = 168s =  2m 48s
 2.5m = 210s =  3m 30s
 3.0m = 252s =  4m 12s
 3.5m = 294s =  4m 54s
 4.0m = 336s =  5m 36s
 4.5m = 378s =  6m 18s
 5.0m = 420s =  7m  0s
 5.5m = 462s =  7m 42s
 6.0m = 504s =  8m 24s
 6.5m = 546s =  9m  6s
 7.0m = 588s =  9m 48s
 7.5m = 630s = 10m 30s
 8.0m = 672s = 11m 12s
 8.5m = 714s = 11m 54s
 9.0m = 756s = 12m 36s
 9.5m = 798s = 13m 18s
10.0m = 840s = 14m  0s
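And if you'd rather let the phone do the math, the same rule is a couple of lines in roughly any language. A sketch in JavaScript (the function name is mine):

// Convert x minutes to the 1.4x equivalent: 84 seconds per input minute.
function convert(minutes) {
    const total = Math.round(minutes * 84);
    return `${Math.floor(total / 60)}m ${total % 60}s`;
}

convert(2.5); // "3m 30s"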
Posted on 20-02-19, 14:38 in Semaphore timing issues in Linux (revision 1)
Post: #120 of 203
O.k. so I created a program that spawns a thread, takes a lock, sleeps for five seconds and then releases it. The thread, meanwhile, waits for 1 second before waiting up to 3 seconds for the lock to disappear.

This is apparently impossible to do with CLOCK_MONOTONIC, and it frustrates me to no end, because it means I cannot sync my system's clock and still have a reliable delay (consider DST: I set a delay of 0.5s, but since DST just changed, that delay becomes 3600.5s instead...)

Here is my code so far; does anyone know how to fix this?

Expected behavior is that the main thread creates the child thread, takes the lock, and waits for 5 seconds before giving it back.

Meanwhile, the child thread should be delayed for up to 3 seconds if the mutex is taken. If you comment out the mutex-taking, it should return after 1s. In both cases it sticks at 3 seconds.


#include <stdio.h>
#include <pthread.h>
#include <unistd.h>
#include <errno.h>
#include <time.h>

pthread_mutex_t lock;
struct timespec start;
struct timespec current;

void* timeoutThread(void* args) {
    int err;
    pthread_condattr_t attr;
    pthread_cond_t cond;
    struct timespec timeout;

    (void)args;

    /* Deadline: 3 seconds after thread start, on the monotonic clock. */
    clock_gettime(CLOCK_MONOTONIC, &timeout);
    timeout.tv_sec += 3;

    /* The condition variable must be told to use CLOCK_MONOTONIC, otherwise
       pthread_cond_timedwait() measures the deadline against CLOCK_REALTIME. */
    pthread_condattr_init(&attr);
    pthread_condattr_setclock(&attr, CLOCK_MONOTONIC);
    pthread_cond_init(&cond, &attr);

    usleep(1000000);

    /* pthread_cond_timedwait() requires the mutex to be locked by the caller;
       it atomically releases it while waiting and re-acquires it before
       returning. Calling it with an unlocked mutex is undefined behavior. */
    pthread_mutex_lock(&lock);
    err = pthread_cond_timedwait(&cond, &lock, &timeout);
    pthread_mutex_unlock(&lock);

    clock_gettime(CLOCK_MONOTONIC, &current);
    /* Crude elapsed-time stamp; the nsec field is not normalized. */
    printf("[%d.%d] ", (int)(current.tv_sec - start.tv_sec), (int)(current.tv_nsec - start.tv_nsec));
    switch (err) {
    case ETIMEDOUT: printf("Timeout expired\n"); break;
    case EINVAL: perror("Error"); break;
    default: printf("Took it in time\n"); break;
    }
    return NULL;
}

int main(void) {
    pthread_t handle;

    pthread_mutex_init(&lock, NULL);

    clock_gettime(CLOCK_MONOTONIC, &start);
    clock_gettime(CLOCK_MONOTONIC, &current);

    printf("[%d.%d] ", (int)(current.tv_sec - start.tv_sec), (int)(current.tv_nsec - start.tv_nsec));
    printf("Start\n");

    pthread_create(&handle, NULL, timeoutThread, NULL);

    pthread_mutex_lock(&lock);
    usleep(5000000);
    pthread_mutex_unlock(&lock);

    clock_gettime(CLOCK_MONOTONIC, &current);
    printf("[%d.%d] ", (int)(current.tv_sec - start.tv_sec), (int)(current.tv_nsec - start.tv_nsec));
    printf("Done\n");

    /* Wait for the child so its printout isn't cut off by process exit. */
    pthread_join(handle, NULL);

    return 0;
}