Curation as an act of journalism

In September 2020, I started a site called Portland Protest News. Its purpose was to be a daily collection of what I considered the best reporting and analysis from the past 24 hours on the protests in Portland, Oregon. The site focused primarily on local news sources, since those reporters were the people closest to the action. However, I included links to posts and articles from other sources if they brought out aspects of the protests not covered in the local media.

In creating and running this site, I was not generating any original reporting on events concerning the Portland protests. However, I feel that my activity did fit within Wikipedia’s definition of journalism:

Journalism is the production and distribution of reports on current events based on facts and supported with proof or evidence.

I was filling the “distribution” role in the above definition. However, I was also performing these roles (per Mindy McAdams):

  1. Selection of the best representatives: If a museum curator has access to 10,000 small clay tokens from ancient Iraq and Syria, how many — and which ones — should appear inside the glass case? If a journalist is going to provide links to reliable sources about planning for retirement (or breast cancer, or choosing a college), which are the best, clearest, and most up-to-date?
  2. Culling: How many links is enough, and not too much? If the museum curator puts 100 of those tokens in one case, my eyes will glaze over.
  3. Provide context: Will you include a bit of explanatory text to show me how each source differs from the others? Why am I looking at this one? Where is it from? How old is it? Why is this one significant?

The Online Journalism Blog also touches on curation:

Curation is a relatively new term in journalism, but the practice is as old as journalism itself. Every act of journalism is an act of curation: think of how a news report or feature selects and combines elements from a range of sources (first hand sources, background facts, first or second hand colour). Not only that: every act of publishing is, too: selecting and combining different types of content to ensure a news or content ‘mix’.

Where am I going with this? Here is my point: when someone collects links on a topic, that is an act of journalism. It is a type of journalism that all of us can pursue and contribute to.

The process of democracy

Within the past few weeks, the citizens of the United States of America have had the opportunity to follow the election process across the 50 states and the interaction of the states with the federal government. Reports from media organizations and social media have been the main way people have monitored these processes. However, I have had two experiences since Election Day that demonstrated that being at an event, or observing it live, can provide much more information than the summary presented by a media organization.

The first instance was a press conference by President-Elect Joe Biden in Wilmington, Delaware. This was carried live by ABC News (where I saw it), and the experience was a breath of fresh air – indeed, a whole tank of fresh air! Kamala Harris gave an address, followed by Joe Biden. These addresses were clear and understandable. Afterwards, Joe Biden took questions from reporters. The phrase that accurately described it was “presidential”. Biden was polite and thoughtful in his replies to questions; he was honest in saying “I don’t know” when he did not know the answer; he was straightforward in explaining that he could not answer some questions because he had not yet been inaugurated as President; and he responded respectfully even when reporters were clearly trying to bait him into responding in a rash way. It was an experience I had not had in over four years.

The second instance was a livestream of the meeting of the Michigan Board of State Canvassers held on Monday, November 23rd. I watched the initial part of this meeting via a link from the Detroit Free Press, although I later found other links to the YouTube live video. Through this livestream, I had a front-row seat to the statements from the board members, testimony from the state elections director, multiple clerks and election officials, and a number of private individuals. I was impressed and proud to hear how many individuals contributed to carrying out the election in Michigan. I was disgusted to see how board member Norman Shinkle treated a number of the people giving testimony. I was impressed with how board member Aaron Van Langevelde asked questions in an attempt to make an informed decision about what actions the board should take. I was pleased with the other two board members, Jeannette Bradshaw and Julie Matuzak, and their responses to the situation. I had to invest several hours to participate in this way, but I received a richness of information that, again, was not available from any other news source.

After these experiences, I was filled with pride in the people who serve our country in the electoral process and in the work of the courts in resolving disputes. I was also proud to see first-hand the President and Vice-President who will begin leading our nation in January 2021. Although my participation in this election was solely to vote, I am going to look for ways to get more involved in the political process of my city, state, and country. I encourage all citizens of the United States of America to do the same, to help heal the divide that affects our nation.


A dog’s life

Here is our dog Joey taking his post-barking-at-the-latest-delivery-person nap:

In honor of President-elect Joseph Biden’s victory, I have started calling Joey “Mr. President” at various times during the day.

One of my kids sent me a video of a dog that looks just like Joey (click this Reddit link), although Joey would not be able to fit inside a dryer bin.

nodeStorage: Feedback on new migration changes (affects 1999.io)

I wrote a post recently about a 1999.io server that I set up for a friend. That post documented a problem that happened when I initially set up the server. A few weeks ago, he wrote me to say that he was getting 502 Bad Gateway when he tried to access his site. After some review, it appeared that the 502 Bad Gateway was due to the fact that the server process had stopped for some reason. I went to restart the server, but got an error immediately:

    Cannot find module ./main.js

After some more review, I saw that the nodeStorage repo had been updated in October (https://github.com/scripting/nodeStorage). nodeStorage is the backend of the 1999.io blogging tool. I remembered that in the past, some of these apps would try to update themselves periodically, and from looking at the installation directory, this appeared to be the case. The update created an NPM package for nodeStorage (it appears this was done to make future updates easier). I agree that the manner of updating 1999.io in the past has been somewhat awkward, so I hope this will improve the environment for future updates. However, I wanted to get my friend’s server working again, so I started trying to figure out how to fix things as quickly as I could.
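For anyone hitting the same symptom, the quick checks that pointed me at the stopped process looked roughly like this (a sketch from memory; I use forever to keep the server running, and the install directory name is an assumption):

    # check whether forever still has the process under management
    forever list
    # try starting the server by hand from the install directory
    cd ~/nodeStorage          # assumed install location
    node storage.js           # this is what produced: Cannot find module ./main.js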

The description for the new update process (https://github.com/scripting/nodeStorage/blob/master/package.md) had the following steps:

  1. Download the package.
  2. The code you need is in the example folder. Copy the two files into your app’s folder.
  3. Edit app.js to the name of your app, and update package.json accordingly.
  4. You can delete the other files.
  5. At the command line, enter npm install.
  6. You still have to have a config.json file as before.

I began to work through these steps; here are my notes:

1. Download the package.

Since I did not think my install was ready for the ‘npm update’ type solution (the future state), I instead downloaded a Zip file of the current nodeStorage repo (https://github.com/scripting/nodeStorage/archive/master.zip) and unzipped it.
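The commands were roughly the following (run wherever you want the unzipped copy to land; the file and directory names are the defaults GitHub uses):

    # download the current nodeStorage repo as a Zip file and unpack it
    curl -L -o nodeStorage-master.zip https://github.com/scripting/nodeStorage/archive/master.zip
    unzip nodeStorage-master.zip      # creates a nodeStorage-master directory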

2. The code you need is in the example folder. Copy the two files into your app’s folder.

I reviewed the unzipped file and saw that there was a folder called “example” and it contained two files (app.js and package.json). I copied these files into my existing nodeStorage directory.
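Roughly (assuming the unzipped nodeStorage-master folder sits next to the existing install, which I am calling nodeStorage here):

    # copy the two example files into the existing nodeStorage install
    cp nodeStorage-master/example/app.js nodeStorage/
    cp nodeStorage-master/example/package.json nodeStorage/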

3. Edit app.js to the name of your app, and update package.json accordingly.

I decided that I wanted to keep the storage.js app the same, so I changed the app name in app.js from “nodestorage” to “storage” (which corresponds to storage.js). I decided to keep the current package.json for now.
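I made the edit by hand, but it amounts to something like this (the exact quoting of the app name inside app.js is an assumption, so check the file before running anything like this):

    # change the app name string from "nodestorage" to "storage" inside app.js
    sed -i 's/"nodestorage"/"storage"/' app.js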

4. You can delete the other files.

I decided to delete the node_modules directory within the nodeStorage directory (due to step 5).
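That deletion was simply (run from inside the nodeStorage directory):

    # remove the old dependencies so the npm install in step 5 rebuilds them cleanly
    rm -rf node_modules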

5. At the command line, enter npm install.

I executed this command from the nodeStorage directory.
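For completeness, that command, run from inside the nodeStorage directory:

    # reinstall dependencies per the new package.json
    npm install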

6. You still have to have a config.json file as before.

I had a config.json file, so I left it as is.

I then started the server using the command “node storage.js”, and it appeared to come up correctly. I then stopped it and started it again using the command “forever start -a storage.js”.
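Spelled out, the start-and-restart sequence was (the Ctrl-C between the two commands is implied):

    # verify the server starts cleanly in the foreground first, then hand it off to forever
    node storage.js               # watch the startup output, then stop it with Ctrl-C
    forever start -a storage.js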

I then repeated this operation with a second 1999.io install, except that I left app.js and package.json from the example folder as-is. I then used the command “node app.js” and saw the server come up correctly. Finally, I stopped it and started again using the command “forever start -a app.js”, with everything working normally.

Conclusions/suggestions:

  1. Recommend changing step 1 to say “getting the nodeStorage repo Zip file” rather than “Download the package”, as the current wording gives the impression that you are supposed to issue some NPM command right at the start.
  2. Be more specific about what files could/should be deleted in step 4. I deleted the node_modules directory, but perhaps other files could be deleted as well.

Bookmarked The Code That Controls Your Money (wealthsimple.com)
Another in a series of articles I have seen about COBOL; it includes some nice history and examples of the current use of COBOL in the US financial industry.

1999.io: How to make sure posts get into the RSS feed

Recently, I helped a friend (Ron Chester) set up a 1999.io instance for a blog on ham radio. In his initial posts, however, he did not see the posts appear in the RSS feed for the site. I did some debugging (I will go through that in another post) and found that the problem was due to a setting within 1999.io. There is a post on the 1999 blog about Facebook Instant Articles support, which says that a checkbox needs to be checked to enable IA support.

In the default install, this box was checked. As a result, if a user posted something without a title, it would not appear in the RSS feed (the 1999 blog post linked above states this). To make sure that all posts (with or without titles) appear in the RSS feed, this box should be unchecked, as shown in the above picture.


I am back

For my five loyal readers, I wanted to let you know that I am returning to active posting on this site. I have been spending the majority of my writing time on my Portland Protest News site, but I am drawing that to a close. Another obstacle was an unforeseen hospital stay a few weeks ago, but I have pretty much recovered from that, so I think it is time to jump back in. I have a few posts on the 1999 blog tool to get out, as well as some other items of note, so sit back, relax, and enjoy the ride!