
OpFactions reset

gewoon_jayden2.0 · Epic Pika · Joined: Jul 27, 2019 · Messages: 132 · Points: 39
The OpFactions reset needs to happen earlier, because there are no F-Top payments anymore. Actually, you should have made the reset happen right after all the payments were over.
 

Luppeeyz · Banned · Joined: Jul 10, 2018 · Messages: 1,572 · Points: 124
gewoon_jayden2.0 said:
The OpFactions reset needs to happen earlier, because there are no F-Top payments anymore. Actually, you should have made the reset happen right after all the payments were over.
I don't even get a single word of what you meant above.
F-Top payment rewards already ended when season 1 ended (90 days). And it just got reset like 3 months ago and you're already asking for another reset...
 

UpperGround · Great Reporter · Joined: Jul 1, 2019 · Messages: 13,103 · Points: 300
Luppeeyz said:
I don't even get a single word of what you meant above.
F-Top payment rewards already ended when season 1 ended (90 days). And it just got reset like 3 months ago and you're already asking for another reset...
LOL
At least Creative won't reset for a while.
 
gewoon_jayden2.0 (OP) · Epic Pika · Joined: Jul 27, 2019 · Messages: 132 · Points: 39
Luppeeyz said:
I don't even get a single word of what you meant above.
F-Top payment rewards already ended when season 1 ended (90 days). And it just got reset like 3 months ago and you're already asking for another reset...
LOL, no one likes it, like no one, because the fking staff is too lazy to reset OpFactions. It's already been 6 months, go reset it because it's trash asf, and get staff, every hacker on the server won't get banned.
 

Luppeeyz · Banned · Joined: Jul 10, 2018 · Messages: 1,572 · Points: 124
gewoon_jayden2.0 said:
LOL, no one likes it, like no one, because the fking staff is too lazy to reset OpFactions. It's already been 6 months, go reset it because it's trash asf, and get staff, every hacker on the server won't get banned.
Are you dumb in the head? If you think so, then what about Creative, KitPvP, and CSB? They got reset WAY, WAY sooner than OpFactions and are dead already, but did the staff reset them again?
Absolutely not. Besides that, asking for a reset is just a waste of time, because the owners aren't going to give YOU ANY CHANCE to give your opinion.
 

Axteroid · Configurator, Developer · Joined: Mar 22, 2017 · Messages: 4,123 · Points: 235
If they reset now it would be boring. They need a new kind of plugin that gives people some kind of task and a reason to play; it would be stupid of them to reset mid-season. (Also, I know memefair got boosted by alts.)
 

Luppeeyz · Banned · Joined: Jul 10, 2018 · Messages: 1,572 · Points: 124
Luppeeyz's replies are so fcking stupid, holy sht
Morning.
And your responses are so damn fcking toxic, either way.
Feel free to read this, make up your mind, and think about who you are:

According to research on computer-mediated communication by Lincoln Dahlberg and others, trolls were originally understood as mischievous tricksters trying to be annoying or disruptive. In early Internet culture, trolls created false personas to integrate into an online community and ultimately derail group conversations. Since then, however, the term has become a catchall to describe any sort of antisocial or disruptive online behavior. A term now heavily used by reporters, “trolling” is frequently used interchangeably to refer to bullying and hate speech, muddying the waters around the word’s definition and descriptive power. As a catchall media label, “trolling” invokes a kind of nebulous Internet folk devil rather than an actual person or persons behind the computer screen. It obscures the underlying hate speech. If observers were to shift away from such uses of “troll” and “trolling,” they could actually name the specific toxic behaviors, the sexism, racism, homophobia, and transphobia, that those labels actually represent.

Toxic behavior is pervasive in every online environment. Maeve Duggan’s “Online Harassment,” a study released by Pew Research in 2014, leads with the finding that 40% of internet users have faced harassment and 73% of users have seen others get harassed. Although physical threats were only witnessed by a quarter of respondents in this study and only 8% said they were physically threatened, these aggregate numbers misrepresent many users’ experiences online. Seven in ten Internet users aged 18 to 24 have been harassed while online, and 26% of women in that age group report being stalked online. Such statistics provide a first glimpse at the scale of the problem of toxic online environments, and they show that common practices of community self-selection fail to address harassing online behaviors.

Recent research shows that toxicity also exists across online gaming groups, and is not isolated to a particular game or specific player community. Alexis Pulos’ research finds that player posts to online forums like the World of Warcraft player community often create a culture of hostility toward gay, lesbian, bisexual, and transgender people. Similarly, Kishonna Gray’s ethnography of the Xbox Live gaming community reveals a constant barrage of gendered and racially motivated harassment directed at women of color who opt to communicate with teammates via voice chat. Problems are worsened by gaming community leaders who claim that gender-based harassment is a “non-issue” and dismiss their responsibility for fostering rape cultures. As these evasions show, the industry will likely be resistant to change unless external pressure is applied. Yet unless hostile online behaviors are reduced, vulnerable people, marginalized groups, and the public generally will all be further harmed.

Recent efforts to understand and respond to such pervasive toxicity include a 2015 panel on online harassment convened by Caroline Sinders at South by Southwest; the 2017 workshop on Abusive Language Online held at the annual conference of the Association for Computational Linguistics; and the “Notoriously Toxic” project funded by the National Endowment for the Humanities, which brought together a working group of game developers, legal experts, social scientists, computational linguists, and humanists. Also relevant are experiments like Google Jigsaw’s Perspective, which attempts to use a machine learning classifier to score text strings on a scale from “very toxic” to “very healthy.”
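
As a rough illustration of what such a classifier does, here is a minimal Python sketch that sends a comment to a toxicity-scoring service over HTTP and reads back a single score; the endpoint URL, request fields, and response shape are placeholders for illustration, not the actual Perspective API contract.

import requests

def score_toxicity(comment: str, api_key: str) -> float:
    """Return a toxicity score between 0.0 (very healthy) and 1.0 (very toxic)."""
    # Hypothetical endpoint and payload; a real integration would follow the
    # provider's documented URL, field names, and authentication scheme.
    url = "https://toxicity.example.com/v1/comments:analyze"
    payload = {"text": comment, "language": "en"}
    response = requests.post(url, json=payload, params={"key": api_key}, timeout=10)
    response.raise_for_status()
    # Assumed response shape: {"toxicity": {"score": 0.87}}
    return response.json()["toxicity"]["score"]

# Example usage: print(score_toxicity("you are all idiots", api_key="YOUR_KEY"))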
 

BestPikaPlayer · Great Reporter · Joined: Jan 12, 2019 · Messages: 417 · Points: 40
You are toxic.
 

TheSavior · Banned · Joined: Aug 7, 2019 · Messages: 244 · Points: 18

How much fking time do you have on your hands to write PikaNetwork forum replies? Jesus, your life is sad.
 